
00:00:02:11 - 00:00:04:04
Good morning,

00:00:04:04 - 00:00:06:08
welcome to the Data Governance

00:00:06:08 - 00:00:09:08
Forum at Innopolis

00:00:09:10 - 00:00:14:07
for a highly important subject,
a societal subject.

00:00:14:09 - 00:00:19:10
A subject that brought us together
two years ago already in Le Havre

00:00:19:12 - 00:00:23:12
to think about this
highly important subject

00:00:23:14 - 00:00:26:01
of data governance.

00:00:26:01 - 00:00:28:18
But why
data governance?

00:00:28:18 - 00:00:32:12
Well, because we already understood,
two years ago

00:00:32:14 - 00:00:34:28
that digital technology

00:00:34:28 - 00:00:36:19
quickly became

00:00:36:19 - 00:00:40:15
ubiquitous in our society.

00:00:40:17 - 00:00:42:10
What does that mean?

00:00:42:10 - 00:00:45:25
Well, actually it's everywhere,

00:00:45:28 - 00:00:48:16
It is in everyday life,

00:00:48:16 - 00:00:51:08
it is in our very thoughts.

00:00:51:08 - 00:00:55:28
And today,
we realize that it goes very quickly

00:00:56:00 - 00:00:58:08
and that unfortunately,

00:00:58:08 - 00:01:02:23
there aren't really any rules or codes
surrounding

00:01:02:25 - 00:01:05:03
this digitization.

00:01:05:03 - 00:01:09:08
Digitalization of our society.

00:01:09:10 - 00:01:13:25
However, digital

00:01:13:27 - 00:01:15:14
is originally,

00:01:15:14 - 00:01:17:24
I would say, a real change

00:01:17:24 - 00:01:21:07
of society, as important

00:01:21:10 - 00:01:26:23
as the invention of writing
5,000 years ago.

00:01:26:25 - 00:01:28:11
Why?

00:01:28:11 - 00:01:30:24
Because, you see,

00:01:30:24 - 00:01:34:05
it is behind a complete overhaul

00:01:34:07 - 00:01:37:04
of our information system.

00:01:37:04 - 00:01:40:11
The information system
which allows a human being to communicate

00:01:40:11 - 00:01:43:22
with a human being
and live in society.

00:01:43:25 - 00:01:46:15
And today,
this ubiquitous digital technology

00:01:46:15 - 00:01:50:19
is in the process of completely
changing our models of communication,

00:01:50:22 - 00:01:55:06
from human to human, and of living in society.

00:01:55:08 - 00:01:58:01
Let me give some examples.

00:01:58:01 - 00:01:59:26
The consequences of writing,

00:01:59:26 - 00:02:03:23
of the invention of writing —
what are they in our society?

00:02:03:26 - 00:02:10:02
Democracy,
the religions of the written word, law,

00:02:10:05 - 00:02:12:08
Mathematics,

00:02:12:08 - 00:02:14:04
commerce.

00:02:14:04 - 00:02:16:09
In short, we see that in fact

00:02:16:09 - 00:02:21:04
our entire society is experiencing
a real overhaul of its

00:02:21:07 - 00:02:23:29
information system,

00:02:23:29 - 00:02:26:06
and which induces

00:02:26:06 - 00:02:29:06
a real civilizational change

00:02:29:08 - 00:02:33:10
in our society.

00:02:33:12 - 00:02:35:12
It is therefore necessary to ask ourselves questions.

00:02:35:12 - 00:02:39:01
This is what we said to each other in Le Havre
two years ago and we have been working ever since.

00:02:39:03 - 00:02:43:06
It's time to sit down
and ask the right questions,

00:02:43:08 - 00:02:46:17
questions around the governance
of data

00:02:46:19 - 00:02:50:12
and the issue of sovereignty and freedom

00:02:50:14 - 00:02:54:27
Let me give two or three examples so that
you understand this issue.

00:02:54:29 - 00:02:56:07
Not long ago,

00:02:56:07 - 00:03:00:03
my father died in Paris.

00:03:00:05 - 00:03:04:15
One year later,
I received a tax notice from the City of Paris.

00:03:04:17 - 00:03:08:07
It told me: your apartment is unoccupied,

00:03:08:09 - 00:03:11:05
here is €600 in taxes.

00:03:11:05 - 00:03:12:11
I said to myself, but how?

00:03:12:11 - 00:03:14:11
They knew?

00:03:14:11 - 00:03:16:26
Well, as I had
good contacts,

00:03:16:26 - 00:03:22:11
I understood that in fact the person responsible,
the entity responsible for water meters

00:03:22:14 - 00:03:26:14
had sold all Parisians'
data

00:03:26:16 - 00:03:31:13
to the City of Paris in order to,
in quotation marks, follow and trace

00:03:31:16 - 00:03:34:04
which dwellings
were unoccupied,

00:03:34:04 - 00:03:38:18
that is to say, where there had been
no water consumption.

00:03:38:21 - 00:03:41:20
Do we want this kind of society?

00:03:41:20 - 00:03:43:08
Why am I giving you this example?

00:03:43:08 - 00:03:47:08
I have nothing against taxing
unoccupied housing, but it is

00:03:47:11 - 00:03:49:06
the way.

00:03:49:06 - 00:03:51:28
And is this way of doing things what we want?

00:03:51:28 - 00:03:55:06
In fact, we are entering a society
where we are traced,

00:03:55:08 - 00:03:58:16
traced at every moment of our lives.

00:03:58:18 - 00:04:00:10
However, if

00:04:00:10 - 00:04:04:05
there is no real governance
around this traceability,

00:04:04:07 - 00:04:08:22
we must question ourselves
because it is our freedom that is at stake.

00:04:08:24 - 00:04:10:26
And let me just share one thought
that I had

00:04:10:26 - 00:04:13:26
before handing over to my colleagues

00:04:13:27 - 00:04:18:14
with whom we worked
on the rewriting of rights

00:04:18:17 - 00:04:22:22
and responsibilities
of humans in the digital age.

00:04:22:25 - 00:04:25:19
A short time ago, I was in Paris

00:04:25:19 - 00:04:30:04
and at one point,
I crossed the street a little outside the

00:04:30:07 - 00:04:31:04
crosswalk, and

00:04:31:04 - 00:04:34:28
I said to myself: but tomorrow, if this goes on,
I'm going to get fined.

00:04:35:00 - 00:04:38:25
Because in fact, my mobile phone
showed that I was outside the crosswalk.

00:04:38:27 - 00:04:41:21
And then I said to myself
but there, I bought

00:04:41:21 - 00:04:44:29
a gift for my wife,
but actually, she's going to find out.

00:04:44:29 - 00:04:46:28
Then everyone will know.

00:04:46:28 - 00:04:49:29
And then: I'm going to eat
in such-and-such a restaurant.

00:04:50:06 - 00:04:51:04
That will be known too.

00:04:51:04 - 00:04:54:10
And in fact,
I told myself: in fact, I am traced

00:04:54:12 - 00:04:57:11
throughout my whole life, in everything I do.

00:04:57:11 - 00:04:59:27
Why did I withdraw money here?

00:04:59:27 - 00:05:01:21
To do what?

00:05:01:21 - 00:05:06:11
And there, I say to myself: this society —
I do not want it.

00:05:06:13 - 00:05:07:22
It is not possible.

00:05:07:22 - 00:05:11:28
A life, a society where
everything I do, everything I think,

00:05:12:03 - 00:05:17:07
is traced in the present
moment, perpetually.

00:05:17:10 - 00:05:21:13
I no longer have any freedom of thought,
no freedom to act.

00:05:21:15 - 00:05:24:15
It's George Orwell, 1984.

00:05:24:22 - 00:05:27:28
Well, I don't want this society.

00:05:28:01 - 00:05:32:12
Hence this
rewriting of human rights

00:05:32:18 - 00:05:35:07
which we called human rights
in the digital age.

00:05:35:07 - 00:05:36:22
Why?

00:05:36:22 - 00:05:40:02
Because, in fact, in 1789,

00:05:40:05 - 00:05:43:05
citizens said to themselves

00:05:43:12 - 00:05:47:21
Democracy is good,
but it's time to actually lay

00:05:47:23 - 00:05:52:13
the foundations of a society

00:05:52:16 - 00:05:54:26
and finally

00:05:54:26 - 00:05:56:29
to balance

00:05:56:29 - 00:05:59:29
a society where there are rules

00:06:00:03 - 00:06:04:03
and where we respect human beings.

00:06:04:06 - 00:06:05:19
However,

00:06:05:19 - 00:06:10:16
at that time, digital technology did not
exist; today, we are talking about the digital

00:06:10:16 - 00:06:15:04
twin of a person, of a building,
of a city, of whatever we want.

00:06:15:07 - 00:06:19:01
But we see that in fact
this digital twin

00:06:19:03 - 00:06:21:18
acts directly, governs

00:06:21:18 - 00:06:24:17
the physical and not the other way around.

00:06:24:17 - 00:06:27:01
So if we do not establish rules

00:06:27:01 - 00:06:31:10
to control this virtual world,

00:06:31:12 - 00:06:34:28
well we completely lose our freedom.

00:06:35:01 - 00:06:39:12
So, what was written in 1789 —
liberty, equality,

00:06:39:12 - 00:06:43:08
fraternity
is now in the digital age.

00:06:43:10 - 00:06:45:24
Can we still say it?

00:06:45:27 - 00:06:46:11
Are we

00:06:46:11 - 00:06:49:11
sure that all the conditions
are indeed met?

00:06:49:18 - 00:06:51:27
And when the great powers of this world

00:06:51:27 - 00:06:55:13
claim to respect
human rights,

00:06:55:16 - 00:06:58:07
do they in fact
really respect them?

00:06:58:07 - 00:07:00:13
Or does their society respect them?

00:07:00:13 - 00:07:05:22
Since in fact we see clearly that the rules
which govern the virtual world

00:07:05:24 - 00:07:08:15
are not written

00:07:08:15 - 00:07:10:24
and that’s the work we did.

00:07:10:24 - 00:07:14:27
And so it is now my pleasure
to hand over

00:07:15:00 - 00:07:19:16
to several personalities,
who will introduce themselves briefly

00:07:19:18 - 00:07:23:10
and who will say why
we all worked on this subject.

00:07:23:11 - 00:07:27:09
This rewriting of human rights
that we called

00:07:27:09 - 00:07:31:01
Bill of Rights
and human responsibilities

00:07:31:03 - 00:07:33:08
in the digital age.

00:07:33:08 - 00:07:36:04
Thank you Richard Collin.

00:07:36:04 - 00:07:38:00
Your turn to speak!

00:07:38:00 - 00:07:42:22
Thank you Emmanuel. Hello everyone.

00:07:42:24 - 00:07:44:18
What was just said is important.

00:07:44:18 - 00:07:47:02
You can see where I'm speaking to you from.

00:07:47:02 - 00:07:51:04
I am the general delegate
of an organization called Transitions,

00:07:51:07 - 00:07:54:28
which works on the issue of
the convergence and acceleration

00:07:54:28 - 00:07:58:10
of the economic, democratic,

00:07:58:10 - 00:08:01:21
societal, energy and digital transitions.

00:08:01:24 - 00:08:05:09
We have major partners
behind all this.

00:08:05:12 - 00:08:09:10
As I said, when we started
working in Le Havre, it seemed to me

00:08:09:10 - 00:08:14:28
an important thing, and we came
to realize — you know —

00:08:15:01 - 00:08:17:21
that ultimately,
the data is us.

00:08:17:21 - 00:08:23:00
It is us; it is
consubstantial with what we are.

00:08:23:03 - 00:08:26:13
Because data is meaning, it is
intelligence.

00:08:26:15 - 00:08:30:19
This is ultimately what characterizes us,

00:08:30:21 - 00:08:33:17
what drives us,
what makes us live.

00:08:33:17 - 00:08:35:21
We are, in the end,

00:08:35:21 - 00:08:38:17
data,

00:08:38:17 - 00:08:40:21
and unfortunately, this data,

00:08:40:21 - 00:08:45:15
we give it away
when we would be better off lending it.

00:08:45:18 - 00:08:48:08
And I think that’s the issue.

00:08:48:08 - 00:08:50:13
The issue is that we must consider

00:08:50:13 - 00:08:53:21
that we must not
give our data away.

00:08:53:23 - 00:08:55:20
We must be masters of it since it is

00:08:55:20 - 00:08:58:29
consubstantial with what we are

00:08:59:02 - 00:09:02:28
and that, within this important logic
of freedom and autonomy,

00:09:03:01 - 00:09:07:16
we avoid the drift on which
we are currently embarking —

00:09:07:18 - 00:09:11:05
and now,
the point is to go in that direction.

00:09:11:07 - 00:09:12:19
So I invite you

00:09:12:19 - 00:09:14:20
to live experiences
like the ones I experienced.

00:09:14:20 - 00:09:16:16
I live in the provinces

00:09:16:16 - 00:09:20:01
when I take line fourteen, there is a camera.

00:09:20:04 - 00:09:22:20
I am not from here,
but I can still be filmed.

00:09:22:20 - 00:09:25:00
And then there is facial recognition,
and then you

00:09:25:00 - 00:09:28:06
go somewhere, et cetera.
How does all this happen?

00:09:28:08 - 00:09:29:19
Have I lent my data?

00:09:29:19 - 00:09:32:07
No, it was taken from me.

00:09:32:10 - 00:09:33:21
Did I do so willingly?

00:09:33:21 - 00:09:36:07
No, I didn't volunteer.

00:09:36:07 - 00:09:41:01
So I believe that as citizens,
we need to understand what is

00:09:41:01 - 00:09:44:07
happening
and to be vigilant

00:09:44:10 - 00:09:46:25
and absolutely clear-sighted.

00:09:46:25 - 00:09:49:06
In the short time given to me,
I have

00:09:49:06 - 00:09:52:19
just one thing to say about
what we could already develop.

00:09:52:21 - 00:09:57:01
First thing: read what we wrote —
we spent some time on it.

00:09:57:03 - 00:10:00:00
It is quite powerful. And circulate
what we wrote

00:10:00:00 - 00:10:02:29
on the question of human rights
in the digital age.

00:10:02:29 - 00:10:08:04
The second thing, beyond
the vigilance I mentioned, it must be said:

00:10:08:06 - 00:10:10:07
you need to move forward.

00:10:10:07 - 00:10:12:05
We need to change our mentalities.

00:10:12:05 - 00:10:14:08
We are not in a system.

00:10:14:08 - 00:10:16:03
And before

00:10:16:03 - 00:10:20:04
making any major demand,
we must do pedagogy

00:10:20:04 - 00:10:24:06
on these questions:
we must not allow

00:10:24:09 - 00:10:27:09
our data

00:10:27:12 - 00:10:31:17
to simply be taken, because it is us;

00:10:31:19 - 00:10:35:01
so
we can lend it, we can rent it.

00:10:35:04 - 00:10:37:29
There is therefore a logic of contract
which is essential

00:10:37:29 - 00:10:41:02
and that's why I now hand over to the lawyer.

00:10:41:02 - 00:10:50:20
There you go — just a few words from me.

00:10:50:23 - 00:10:54:22
So, as the lawyer that I am,

00:10:54:25 - 00:10:57:24
I will therefore complete the vision
that we give here

00:10:57:24 - 00:11:02:26
and which Emmanuel recalled,
with a legal perspective

00:11:02:29 - 00:11:07:25
which is at the heart of the questions
of our Data Alliance.

00:11:07:25 - 00:11:10:10
Here it is:

00:11:10:10 - 00:11:11:14
does the citizen

00:11:11:14 - 00:11:14:14
have real, effective control

00:11:14:22 - 00:11:17:09
over his personal

00:11:17:09 - 00:11:20:02
or non-personal data?

00:11:20:02 - 00:11:23:16
It is to this question that

00:11:23:19 - 00:11:26:00
we are

00:11:26:00 - 00:11:26:28
attached.

00:11:26:28 - 00:11:29:03
But beyond a simple question

00:11:29:03 - 00:11:33:24
above all there is a desire to seek
adequate solutions so that the citizen

00:11:33:24 - 00:11:38:20
can have effective control
on this data.

00:11:38:22 - 00:11:40:21
So, control, mastery,

00:11:40:21 - 00:11:45:25
we come to talk
of data ownership

00:11:45:28 - 00:11:46:06
and

00:11:46:06 - 00:11:49:16
an absolute
paradox, since the law enshrines

00:11:49:16 - 00:11:52:07
no ownership over data.

00:11:52:07 - 00:11:54:06
With a double paradox

00:11:54:06 - 00:11:57:22
when you structure the data,
you have a database,

00:11:57:25 - 00:12:00:09
you can become an owner

00:12:00:09 - 00:12:02:21
of the database.

00:12:02:21 - 00:12:06:00
Property law is therefore
not suited to this.

00:12:06:02 - 00:12:08:19
It has somewhat sidestepped

00:12:08:19 - 00:12:10:00
the question.

00:12:10:00 - 00:12:13:14
Actually,
we have to talk, as Emmanuel

00:12:13:16 - 00:12:17:20
and Richard said, about appropriation —

00:12:17:22 - 00:12:19:05
and this appropriation:

00:12:19:05 - 00:12:22:01
You can see clearly, the GAFA have siphoned off

00:12:22:01 - 00:12:25:09
our data,
but much closer to us.

00:12:25:10 - 00:12:27:02
The new models
which are set up

00:12:27:02 - 00:12:30:26
for artificial intelligence harvest
our data.

00:12:30:28 - 00:12:34:18
In the middle you have manufacturers
of connected objects.

00:12:34:20 - 00:12:38:29
Well, it's the patent race
since if I have the technology,

00:12:39:01 - 00:12:44:21
if I have the device, well I own the data
that is generated by the device

00:12:44:23 - 00:12:46:14
and in the
middle, you have all the applications —

00:12:46:14 - 00:12:50:22
applications
and all the people who say:

00:12:50:24 - 00:12:55:01
I collected
the data, so it is mine.

00:12:55:04 - 00:12:57:14
So we are experiencing de facto

00:12:57:14 - 00:13:00:21
appropriation of the data,

00:13:00:23 - 00:13:02:10
and our —

00:13:02:10 - 00:13:05:22
our alliance
aims to reverse, to challenge

00:13:05:22 - 00:13:10:23
this idea of de facto appropriation
and to move towards the notion of sharing,

00:13:10:25 - 00:13:12:22
The notion of right of use.

00:13:12:22 - 00:13:19:01
The notion of how to think differently
with another approach.

00:13:19:03 - 00:13:22:26
So, there are some building blocks
that have been put in place.

00:13:23:04 - 00:13:27:26
You may have heard of the Data Act,
which will give —

00:13:27:28 - 00:13:29:19
which will already oblige

00:13:29:19 - 00:13:33:19
manufacturers of connected objects
to a certain transparency

00:13:33:19 - 00:13:37:26
about which data are actually
generated — which today we do not know.

00:13:37:28 - 00:13:40:14
And then give the citizens

00:13:40:14 - 00:13:45:04
a right of access and a right
to be able to reuse this data.

00:13:45:07 - 00:13:47:02
This is a first building block.

00:13:47:02 - 00:13:51:06
After that,
on the business and B-to-B level,

00:13:51:09 - 00:13:53:24
we will set up
data-sharing

00:13:53:24 - 00:13:56:24
agreements,
what are known as data spaces,

00:13:57:00 - 00:13:58:15
which are supposed to

00:13:58:15 - 00:14:03:13
facilitate the flow of data
in order to create this data

00:14:03:16 - 00:14:06:16
economy, which is so necessary,

00:14:06:23 - 00:14:10:24
but at the same time applying —
and that's where we're going to be

00:14:10:25 - 00:14:16:06
extremely vigilant — applying
what are called fair and loyal practices,

00:14:16:08 - 00:14:19:19
that is to say

00:14:19:22 - 00:14:22:06
conditions, a contractual framework

00:14:22:06 - 00:14:26:03
that is supposed to be loyal
and fair, with fair remuneration.

00:14:26:05 - 00:14:29:20
There we will be watchmen
and guardians

00:14:29:21 - 00:14:34:08
of these fair and just practices —
and more, I would say:

00:14:34:10 - 00:14:38:25
we will go as far as labeling
these fair and loyal practices.

00:14:38:28 - 00:14:41:29
So, individual property,

00:14:42:01 - 00:14:43:15
collective property.

00:14:43:15 - 00:14:48:13
We can do it together
and there we can share

00:14:48:15 - 00:14:53:01
and theoretically, there can also
be no property regime at all.

00:14:53:04 - 00:14:57:15
You have to think about all these things
and in the middle, in the nuances,

00:14:57:18 - 00:15:02:06
you have open data,
open source, data altruism.

00:15:02:08 - 00:15:05:08
This is a reflection that we are launching here

00:15:05:10 - 00:15:09:23
and obviously we address the problem
of the territoriality of data

00:15:09:26 - 00:15:12:01
where it is collected,

00:15:12:01 - 00:15:16:08
where it is processed, where it is stored
with questions of sovereignty.

00:15:16:10 - 00:15:19:03
that this raises — and obviously

00:15:19:03 - 00:15:23:28
the disparity of protection
between one jurisdiction and the other.

00:15:24:01 - 00:15:25:20
Finally,

00:15:25:20 - 00:15:30:11
we will try to be guardians
of a balance

00:15:30:13 - 00:15:33:14
a balance between individual property,

00:15:33:14 - 00:15:37:15
intellectual property, collective property,

00:15:37:17 - 00:15:40:15
between commons, between sharing

00:15:40:15 - 00:15:44:19
between entrepreneurship, but above all

00:15:44:22 - 00:15:47:22
of the fundamental rights of citizens.

00:15:47:26 - 00:15:53:00
Thank you. Thank you, bravo —

00:15:53:03 - 00:15:54:03
thank you.

00:15:54:03 - 00:15:59:00
You who really brought the law,
access, into this

00:15:59:03 - 00:16:00:09
alliance.

00:16:00:09 - 00:16:05:24
So I am a researcher, I teach
at Sciences Po and elsewhere and I work

00:16:05:24 - 00:16:09:11
particularly on new governance
in the digital age.

00:16:09:13 - 00:16:11:24
And my common thread is freedom.

00:16:11:24 - 00:16:16:13
You can well understand why I
join Emmanuel on all these subjects.

00:16:16:16 - 00:16:19:21
For my part, I'm going to speak
a little about Europe.

00:16:19:24 - 00:16:22:15
In my opinion,
we must gain our independence.

00:16:22:15 - 00:16:26:29
Europe must not be under supervision
and you have to make a real choice

00:16:26:29 - 00:16:32:14
of sovereignty, with an arsenal capable
of sanctioning bad practices,

00:16:32:14 - 00:16:37:06
notably by elected officials,
who may not even be aware of them.

00:16:37:06 - 00:16:40:01
In my view, to be elected today,
you need to know both

00:16:40:01 - 00:16:43:14
geopolitics
and cyber power.

00:16:43:16 - 00:16:47:00
So maybe it's a little presumptuous
to say that here at Innopolis,

00:16:47:03 - 00:16:50:29
but I really believe in the necessity

00:16:51:01 - 00:16:55:11
to put in place
acculturation in the digital age

00:16:55:14 - 00:17:00:07
to allow for somewhat global
reflection on the new issues.

00:17:00:09 - 00:17:05:25
So, as Emmanuel and Richard
were saying, these issues

00:17:05:27 - 00:17:09:21
require alignment

00:17:09:23 - 00:17:13:24
across research, politics,
law,

00:17:13:29 - 00:17:18:24
and new governance — and how
will we find the new models

00:17:18:27 - 00:17:24:09
that will be appropriate
to these new digital challenges?

00:17:24:11 - 00:17:28:07
Emmanuel
often talks about this transformation,

00:17:28:09 - 00:17:31:05
of this new world that is arriving.

00:17:31:05 - 00:17:35:12
It's this new world
that we must try to build,

00:17:35:12 - 00:17:38:23
to co-construct
and think about new models.

00:17:39:01 - 00:17:42:11
It also means thinking
about the issue of the commons.

00:17:42:13 - 00:17:46:27
We talk about the commons, like those

00:17:46:29 - 00:17:50:28
of Elinor Ostrom,
who laid down many principles,

00:17:51:01 - 00:17:55:21
and about what they become in the digital age —
which in my opinion brings new things.

00:17:55:24 - 00:18:01:24
A new way of governing which,
in the digital age, has real meaning.

00:18:01:27 - 00:18:04:19
And last point,

00:18:04:21 - 00:18:05:27
I think that

00:18:05:27 - 00:18:08:27
data is an extension of oneself.

00:18:09:02 - 00:18:13:28
And this data must become an asset

00:18:14:00 - 00:18:18:18
that gives the citizen
a voice to deliberate.

00:18:18:18 - 00:18:21:26
So that the citizen no longer endures what we have seen.

00:18:21:27 - 00:18:25:29
We suffer,
we undergo this appropriation of data.

00:18:26:02 - 00:18:30:17
Concretely,
we must take control of our destiny

00:18:30:19 - 00:18:35:05
and that, in my opinion, is
the ultimate goal.

00:18:35:07 - 00:18:38:27
And one last point:
I think the State-market pairing

00:18:39:00 - 00:18:41:22
must be somewhat re-examined by

00:18:41:22 - 00:18:46:13
bringing in
the necessary approach of a State

00:18:46:16 - 00:18:50:04
that is a partner — a partner with us
and with you.

00:18:50:06 - 00:18:50:22
Well done!

00:18:50:22 - 00:18:53:22
Thank you. Sabine?

00:18:53:26 - 00:18:54:26
Hello, my name is Mathieu

00:18:54:26 - 00:18:58:03
Mercado, I am CEO of a company
called Magma

00:18:58:06 - 00:19:02:07
and I'm here as an entrepreneur
and also a father

00:19:02:09 - 00:19:05:13
because I share the vision of the
Data Governance Forum: that we are

00:19:05:13 - 00:19:09:21
in a civilizational change
and data is on its way

00:19:09:21 - 00:19:13:25
to become the center of
all economic and social relations

00:19:13:27 - 00:19:16:24
between individuals,
companies and states.

00:19:16:24 - 00:19:20:17
In a private setting, we produce
data constantly

00:19:20:17 - 00:19:23:15
when we consume,
when we move,

00:19:23:15 - 00:19:25:07
when we interact with each other.

00:19:25:07 - 00:19:30:22
And this data is often sold
to more or less scrupulous operators

00:19:30:24 - 00:19:34:22
who profit economically
from our data.

00:19:34:25 - 00:19:37:24
So this is already a major problem
from a private point of view.

00:19:37:24 - 00:19:41:05
And then, from an economic
point of view, there is talk of central bank

00:19:41:07 - 00:19:46:03
currency, and that will
pose extremely important problems

00:19:46:03 - 00:19:49:27
for fundamental freedoms,
since we are going to eliminate

00:19:49:29 - 00:19:54:15
the intermediary that exists between the individual
and monetary production.

00:19:54:18 - 00:19:56:22
Today,
you have a commercial bank

00:19:56:22 - 00:19:59:02
which holds your bank statements,
and when you consume,

00:19:59:02 - 00:20:01:20
when you buy things — well,

00:20:01:20 - 00:20:05:21
you have accepted that the bank
has access to your statements,

00:20:05:23 - 00:20:08:05
your expenses and your movements.

00:20:08:05 - 00:20:09:25
If we talk about
a central bank currency,

00:20:09:25 - 00:20:12:27
we must understand that it is
the central bank itself that will directly

00:20:12:27 - 00:20:16:03
have access to your expenses — and

00:20:16:06 - 00:20:18:26
it is an eminently political body.

00:20:18:26 - 00:20:23:25
And so we can imagine the excesses
that this can cause.

00:20:23:27 - 00:20:26:12
So today, what I want to defend,

00:20:26:12 - 00:20:30:10
here at the Data Governance Forum,
is a new mode of data

00:20:30:10 - 00:20:33:18
governance and also —
above all — of data organization.

00:20:33:20 - 00:20:36:07
We live in a world
which is very centralized.

00:20:36:07 - 00:20:40:13
We have allowed these platforms
which centralize data to develop,

00:20:40:16 - 00:20:44:13
platforms which have the power
to censor data, and

00:20:44:19 - 00:20:49:02
which also create a security risk
since they can be hacked.

00:20:49:02 - 00:20:50:26
We call this a single point of failure,

00:20:50:26 - 00:20:53:12
that's to say
that the data is in one place

00:20:53:12 - 00:20:56:12
and so that can pose
huge problems.

00:20:56:12 - 00:21:01:10
So there are new methods
of data management that appeared

00:21:01:10 - 00:21:06:13
with blockchain: so-called
distributed or decentralized modes.

00:21:06:15 - 00:21:10:23
So decentralization is, in a way,
the message I want to defend at the Data Governance Forum.

00:21:11:00 - 00:21:12:22
Let me explain.

00:21:12:22 - 00:21:16:06
I am going to give you an example
which you will easily understand.

00:21:16:11 - 00:21:19:11
Take the founder of Ethereum,

00:21:19:11 - 00:21:22:02
the second
largest blockchain in the world.

00:21:22:02 - 00:21:24:17
How did
he get the idea to create it?

00:21:24:19 - 00:21:25:25
He was a gamer.

00:21:25:25 - 00:21:29:05
He played a lot of video games
and he was playing World of Warcraft.

00:21:29:07 - 00:21:34:19
So it's a video game on which he
played for months or even years.

00:21:34:21 - 00:21:37:24
He had reached the end of the video game
and you know, in video games,

00:21:37:24 - 00:21:39:04
you can buy armor,

00:21:39:04 - 00:21:42:20
you can buy weapons
which give you certain powers

00:21:42:22 - 00:21:45:11
and having reached
the end of the game, to caricature,

00:21:45:11 - 00:21:48:22
he bought armor, went to bed,
and the next day, when he woke up,

00:21:48:25 - 00:21:52:14
the video game publisher
had removed the powers from his armor.

00:21:52:16 - 00:21:54:18
And then he said to himself: but wait,
this is not possible.

00:21:54:18 - 00:21:57:28
We cannot live in a world
where the work we have done

00:21:57:28 - 00:21:58:25
is in the hands

00:21:58:25 - 00:22:01:25
of a publisher who can change the rules
overnight.

00:22:01:27 - 00:22:05:03
And on top of that, it can all be lost,
since it is stored

00:22:05:03 - 00:22:06:12
only on the publisher's servers.

00:22:06:12 - 00:22:08:27
So he thought,
he thought about blockchain.

00:22:08:27 - 00:22:11:29
Blockchain
already existed with bitcoin

00:22:12:01 - 00:22:12:19
which is a decentralized

00:22:12:19 - 00:22:16:11
way to issue currency,
and so he said to himself: we are going to create one.

00:22:16:14 - 00:22:19:28
And so he created this
decentralized platform where data

00:22:20:00 - 00:22:23:27
is distributed — the same data —
across hundreds, even thousands of computers,

00:22:24:00 - 00:22:26:29
and we add
a layer of encryption

00:22:26:29 - 00:22:29:29
which means that, obviously, no one can
access this data —

00:22:30:03 - 00:22:33:26
or rather, no one who is not
authorized can look at it.

00:22:33:28 - 00:22:38:04
So I think that
this method of decentralization

00:22:38:04 - 00:22:41:05
can be applied
with the new Web3 tools,

00:22:41:05 - 00:22:44:04
which are wallets,

00:22:44:04 - 00:22:46:18
Reward systems,

00:22:46:18 - 00:22:49:18
reward for data.

00:22:49:19 - 00:22:52:16
We are in an extremely complex world
where actually

00:22:52:16 - 00:22:53:19
we realize today

00:22:53:19 - 00:22:56:19
that we face challenges such that
data must be shared —

00:22:56:26 - 00:23:00:21
and to share data,
there must be something in return.

00:23:00:23 - 00:23:02:25
Indeed —
and this is the goal of the Data Governance Forum —

00:23:02:25 - 00:23:04:18
it is to know why
I would give my data

00:23:04:18 - 00:23:09:10
and under what conditions, and therefore to use
Web3 technological tools

00:23:09:12 - 00:23:13:13
and decentralization to allow
citizens and businesses to share

00:23:13:13 - 00:23:16:27
critical data while retaining
their competitive advantage,

00:23:16:29 - 00:23:19:29
their property rights
and their fundamental freedoms.

00:23:20:04 - 00:23:26:14
There you go. Thank you, Mathieu —
you open doors for us

00:23:26:16 - 00:23:28:29
to make this work.

00:23:28:29 - 00:23:32:16
Hello, my name is Emmanuel Olivier,
I am the president of the company

00:23:32:19 - 00:23:36:01
Open Caps, which, for project

00:23:36:01 - 00:23:39:11
owners in the world of real estate,
deals with data governance.

00:23:39:13 - 00:23:44:15
That's why we joined the Data
Governance Forum: because we fully subscribe

00:23:44:17 - 00:23:46:15
to something we hold
very much at heart,

00:23:46:15 - 00:23:50:10
which is data governance —
and above all to the question

00:23:50:10 - 00:23:53:24
already well discussed
so far, namely:

00:23:53:24 - 00:23:56:09
is the data
a simple resource

00:23:56:09 - 00:24:00:04
or is the data ultimately
an extension of ourselves?

00:24:00:07 - 00:24:01:20
Of course,

00:24:01:20 - 00:24:04:29
we are convinced that the data,
it is an extension of ourselves.

00:24:04:29 - 00:24:07:18
Thanks to our senses,

00:24:07:20 - 00:24:11:08
we interact with the world.

00:24:11:08 - 00:24:15:05
The first technological developments
allowed us to see further

00:24:15:05 - 00:24:18:17
with television — things
that we would not have seen at a

00:24:18:17 - 00:24:23:00
local scale — and to hear things
that are far away, by radio.

00:24:23:03 - 00:24:25:06
And we took a step forward with the Internet.

00:24:25:06 - 00:24:28:12
This time,
we can act further.

00:24:28:12 - 00:24:31:12
We saw it during the Covid period.

00:24:31:18 - 00:24:35:13
If we hadn't had these tools
digital, we would have been very bored.

00:24:35:14 - 00:24:37:14
Still, the economy
was able to continue.

00:24:37:14 - 00:24:41:13
We were able to continue — in any case,
many of us — to work,

00:24:41:15 - 00:24:44:10
to get what we needed,

00:24:44:10 - 00:24:47:10
and to go to school,
because these means existed.

00:24:47:12 - 00:24:51:06
So it is necessary to take into account
that these technological means

00:24:51:09 - 00:24:52:19
are not just a threat.

00:24:52:19 - 00:24:57:13
As we might think
when we hear about the bad guys

00:24:57:13 - 00:25:01:05
who come and take the data, et cetera,
and that it should all ultimately be abolished.

00:25:01:05 - 00:25:01:26
No, not at all.

00:25:01:26 - 00:25:06:27
The dematerialization of things
is extraordinary progress;

00:25:06:29 - 00:25:09:19
it allows us to share
resources,

00:25:09:19 - 00:25:13:29
and indeed you have to see it
as positive — on one condition:

00:25:14:02 - 00:25:17:14
that this extension of us,
of our senses,

00:25:17:16 - 00:25:23:04
does indeed have an inalienable
character — the famous,

00:25:23:06 - 00:25:26:01
and I can never pronounce it,
consubstantiality

00:25:26:01 - 00:25:30:02
that Richard was talking about earlier;
that is to say, roughly, it is inseparable.

00:25:30:04 - 00:25:32:18
It's our own data,
it's ourselves.

00:25:32:18 - 00:25:33:17
From the moment

00:25:33:17 - 00:25:37:12
our data is manipulated, it is
as if we ourselves were being manipulated.

00:25:37:14 - 00:25:40:06
Presented like this,
obviously, it changes everything.

00:25:40:06 - 00:25:45:17
And so, from the moment
where I can give consent

00:25:45:19 - 00:25:48:28
because I have a first service
in return,

00:25:49:00 - 00:25:52:28
letting myself be geolocated to find
a restaurant is very practical,

00:25:53:01 - 00:25:55:07
and it is a form of remuneration.

00:25:55:07 - 00:25:57:20
But when this remuneration,
in fact,

00:25:57:20 - 00:26:00:20
and this exploitation, that is to say
the transformation

00:26:00:24 - 00:26:04:02
of my data
into monetary or economic value,

00:26:04:05 - 00:26:07:25
is completely disproportionate
to what I get in return,

00:26:08:01 - 00:26:13:03
then we have the right to say no;
I should have a revoke button,

00:26:13:06 - 00:26:17:22
unless you chose to put that data
in a, in a —

00:26:17:22 - 00:26:18:01
how shall I put it?

00:26:18:01 - 00:26:20:25
In the public domain or in
the digital commons?

00:26:20:25 - 00:26:24:11
There, indeed, it is different,
since we made that choice,

00:26:24:13 - 00:26:27:19
and those who exploit it should

00:26:27:22 - 00:26:33:12
actually respect the agreements.
And yet this agreement

00:26:33:14 - 00:26:37:09
is not really that clear
when we give our assent,

00:26:37:11 - 00:26:42:01
our consent. And the message
that I wanted to deliver today,

00:26:42:01 - 00:26:47:03
like all the members of the
DG Forum here, is that, on the one hand,

00:26:47:05 - 00:26:50:03
we must not say that all is lost.

00:26:50:03 - 00:26:50:20
That is to say?

00:26:50:20 - 00:26:53:28
We can all, on an individual scale,

00:26:54:01 - 00:26:56:19
push
for this recognition of data.

00:26:56:19 - 00:26:57:24
This is subsidiarity.

00:26:57:24 - 00:27:00:06
That is the second key term: today,
we can act.

00:27:00:06 - 00:27:05:16
That is to say that finally, small step
by small step, each of us at work,

00:27:05:18 - 00:27:08:28
as a citizen,
as a parent, as Mathieu said

00:27:09:01 - 00:27:12:05
just now,
we can exert influence little by little.

00:27:12:05 - 00:27:16:23
And the DG Forum has drawn up

00:27:16:25 - 00:27:20:08
a charter of ten commitments.

00:27:20:08 - 00:27:24:11
And this is what I invite you,
professionals and citizens alike,

00:27:24:13 - 00:27:26:04
to spread everywhere.

00:27:26:04 - 00:27:29:15
Because it is the adoption

00:27:29:17 - 00:27:33:07
of this charter
which, little by little, will make its way

00:27:33:10 - 00:27:37:12
and, in a way, allow us to obtain
fair compensation:

00:27:37:12 - 00:27:39:20
to maintain copyright over our data.

00:27:39:20 - 00:27:43:04
And it is very, very important
that the inalienable character

00:27:43:04 - 00:27:45:22
of our data be recognized

00:27:45:25 - 00:27:48:23
in relation to the use that can be made of it

00:27:48:23 - 00:27:50:10
by a commercial enterprise.

00:27:50:10 - 00:27:56:25
So,
I'm just saying that to you.

00:27:56:27 - 00:27:57:06
So,

00:27:57:06 - 00:28:00:23
one last testimony for the road
before giving the floor back to Emmanuel.

00:28:00:23 - 00:28:03:09
And then a time to exchange with you.

00:28:03:09 - 00:28:06:15
So I am here today
as a blockchain entrepreneur

00:28:06:15 - 00:28:10:25
and as a defrocked banker
and as a defrocked member of the system.

00:28:10:27 - 00:28:14:27
What drives me here
is really this risk, this force.

00:28:15:00 - 00:28:18:11
Citizen — it is in our title,
and not for nothing: it is the human

00:28:18:11 - 00:28:22:09
at the center, for all our friends
of the making of the future here, in this life,

00:28:22:11 - 00:28:26:13
it is the human to whom we give
the keys to understanding and action.

00:28:26:13 - 00:28:28:11
This is the strong smart citizen.

00:28:28:11 - 00:28:31:19
It is the human who controls this data
and not the digital solution

00:28:31:19 - 00:28:33:01
that uses the data.

00:28:33:01 - 00:28:34:06
So why this vigilance?

00:28:34:06 - 00:28:36:19
It matters
because, after all, you could tell me:

00:28:36:19 - 00:28:39:15
but human progress is obviously enshrined
in the cradle of

00:28:39:15 - 00:28:41:22
human rights
and the social contract.

00:28:41:22 - 00:28:43:07
Well no, not at all.

00:28:43:07 - 00:28:47:20
Building a digital society
for an enlightened human

00:28:47:22 - 00:28:50:12
is not at all
at the heart of our social contract, my friends.

00:28:50:12 - 00:28:53:12
Quite simply because the preservation
of human progress

00:28:53:13 - 00:28:56:28
is not its goal — and that is precisely
why we undertook a rereading.

00:28:57:01 - 00:29:00:28
Let us understand that this social
and economic contract, which was conceived

00:29:00:28 - 00:29:05:12
back in the 18th century,
before science fiction, had

00:29:05:12 - 00:29:09:16
no need to question the essence
of humanity in relation to the machine.

00:29:09:19 - 00:29:12:27
Indeed, let us step into their 18th-century
clothes

00:29:12:27 - 00:29:14:21
and their shoes in an instant?

00:29:14:21 - 00:29:17:02
Could you have imagined
that a language model

00:29:17:02 - 00:29:21:03
called artificial intelligence
might one day, by itself,

00:29:21:05 - 00:29:24:23
generate a program capable
of finding your phone number,

00:29:24:26 - 00:29:29:26
of calling you, recording your voice,
of finding your bank, calling it

00:29:29:26 - 00:29:33:29
to reset your password,
then liquidating your account,

00:29:34:01 - 00:29:37:01
or liquidating you,
if connected to autonomous weapons?

00:29:37:04 - 00:29:40:03
Ask the military — we are already there.

00:29:40:03 - 00:29:43:11
So, at that time, in the 18th century,
we were light-years away

00:29:43:11 - 00:29:45:19
and that's a play on words.

00:29:45:19 - 00:29:47:16
Here is what was possible at that time.

00:29:47:16 - 00:29:48:13
What's the important thing?

00:29:48:13 - 00:29:50:20
The important thing
is the conquest of territories

00:29:50:20 - 00:29:52:02
and the control of peoples.

00:29:52:02 - 00:29:55:05
That reminds me of something today.
And at the time, it is true,

00:29:55:06 - 00:29:57:07
it was already great progress,
since we passed

00:29:57:07 - 00:29:59:10
from governance by the gods
and the power of blood —

00:29:59:10 - 00:30:01:11
I barely caricature — to governance

00:30:01:11 - 00:30:05:16
by representative institutions,
political and monetary, in principle.

00:30:05:18 - 00:30:07:29
But it remains governance
by structures.

00:30:07:29 - 00:30:10:29
And if we keep this mindset, let's be sure

00:30:11:04 - 00:30:16:10
that it is ideal for machine systems
and for steering machines.

00:30:16:12 - 00:30:17:17
So if we don't change

00:30:17:17 - 00:30:21:16
our way of thinking about the social contract,
if there is not this rewrite

00:30:21:19 - 00:30:26:10
digital programs already are,
and will systematically be,

00:30:26:13 - 00:30:29:22
in the service of the conquest of power
and the control of peoples.

00:30:29:24 - 00:30:31:09
But at what cost to humanity?

00:30:31:09 - 00:30:34:09
We're really starting to feel it
and measure it.

00:30:34:16 - 00:30:36:28
So it is true that we can be tempted
to lay our current problems

00:30:36:28 - 00:30:38:28
at the feet of states — and we do, a lot:

00:30:38:28 - 00:30:42:23
finance, bankers, elites,
journalists, GAFA.

00:30:42:23 - 00:30:44:16
And frankly,
it is true that it is tempting.

00:30:44:16 - 00:30:47:16
But let us still be aware
that we all, each one of us,

00:30:47:22 - 00:30:50:22
submit to an obsolete social
and economic contract

00:30:50:23 - 00:30:53:23
which does not place
the preservation of humanity at the center.

00:30:53:23 - 00:30:55:20
It's simple, it's basic.

00:30:55:20 - 00:30:57:20
So that's why we started
two years ago.

00:30:57:20 - 00:30:58:17
We told you.

00:30:58:17 - 00:31:01:09
This rereading
of the human rights of 1789

00:31:01:09 - 00:31:04:07
with the eyes of the 21st century,
that’s really what we did.

00:31:04:07 - 00:31:05:23
And that’s why the data charter

00:31:05:23 - 00:31:07:17
which we will also have the opportunity
to tell you about,

00:31:07:17 - 00:31:09:28
and all digital solutions
through which

00:31:09:28 - 00:31:12:28
we work in all our different,
very operational projects —

00:31:13:02 - 00:31:16:03
they are put at the service —
quite literally, truly at the service —

00:31:16:03 - 00:31:17:09
of an enlightened human

00:31:17:09 - 00:31:20:09
and not in the service of building
the perfect steering machine.

00:31:20:14 - 00:31:22:14
In fact, it is a deliberate choice,

00:31:22:14 - 00:31:25:01
and it totally changes
the way of thinking, of programming

00:31:25:01 - 00:31:27:10
digital infrastructure
and building our projects.

00:31:27:10 - 00:31:29:06
We could go into detail with you

00:31:29:06 - 00:31:32:06
and it is a 180-degree turn,
as we often say.

00:31:32:10 - 00:31:35:05
So for us, the digital of a human future —
and this is important —

00:31:35:05 - 00:31:37:21
is not the digital
of a transhumanist future.

00:31:37:21 - 00:31:39:19
Let's really pay attention to this.

00:31:39:19 - 00:31:41:23
And so this human future is possible.

00:31:41:23 - 00:31:47:21
We are at work, and we wish
to move forward together with you.

00:31:47:23 - 00:31:51:04
Thank you so much.

00:31:51:07 - 00:31:52:15
Thank you so much.

00:31:52:15 - 00:31:56:00
It was moving and intense.

00:31:56:03 - 00:32:00:13
I take away this rewriting
of the rights

00:32:00:16 - 00:32:04:10
and responsibilities
of human beings in the digital age.

00:32:04:12 - 00:32:07:18
The three fundamental principles
that we wrote:

00:32:07:20 - 00:32:12:05
the principle of the consubstantiality
of humans and their data,

00:32:12:08 - 00:32:16:14
the principle of subjectivity
in the decision-making process.

00:32:16:16 - 00:32:19:00
I just want to add there, it concerns,

00:32:19:00 - 00:32:21:28
among other things, algorithms.

00:32:21:28 - 00:32:24:17
We want the transparency
of algorithms

00:32:24:17 - 00:32:28:08
and to know the intentionality
of algorithms,

00:32:28:10 - 00:32:31:14
and the principle of subsidiarity
of data governance.

00:32:31:15 - 00:32:35:07
Emmanuel spoke about it,
and Stephanie too.

00:32:35:07 - 00:32:41:05
It's actually a 180 degree change
of the digital decision-making process

00:32:41:05 - 00:32:45:15
as it exists and operates
today.

00:32:45:15 - 00:32:48:14
Where are we?
We are not here only to

00:32:48:14 - 00:32:54:09
enlighten or, as you say, Richard,
which is interesting.

00:32:54:11 - 00:32:57:11
We want to
challenge, but we also want to act.

00:32:57:18 - 00:33:00:08
And so we said to each other collegially

00:33:00:08 - 00:33:05:03
what could we propose
to all the actors, in fact,

00:33:05:05 - 00:33:08:05
to humanity, so that people can be sure

00:33:08:10 - 00:33:12:21
that their data is well used

00:33:12:23 - 00:33:17:11
and also ensure
that when their data is used,

00:33:17:11 - 00:33:21:18
there is reversibility
and indeed a charter of trust.

00:33:21:20 - 00:33:23:25
So, this charter of trust,

00:33:23:25 - 00:33:26:21
it is not set in stone
like writing.

00:33:26:21 - 00:33:30:11
It's just a start,
but I reveal it to you here.

00:33:30:13 - 00:33:34:28
These are the ten commitments that we
are presenting here in preview.

00:33:35:00 - 00:33:38:00
Data for you, data for you,

00:33:38:04 - 00:33:41:11
and data for usage,
because that is where we are:

00:33:41:13 - 00:33:45:15
as Stephanie said,
or as Katy said,

00:33:45:18 - 00:33:49:00
we are also entering the era of use,
the society of uses,

00:33:49:02 - 00:33:54:24
which complements the society
of ownership; we do not oppose them.

00:33:54:27 - 00:33:59:15
With these ten commitments,
which I am not going to comment on here,

00:33:59:17 - 00:34:03:22
but which you will find very well

00:34:03:25 - 00:34:05:27
explained on our stand.

00:34:05:27 - 00:34:08:04
And of course on our website.

00:34:08:04 - 00:34:11:14
But there's still a little time left
and what I propose to you,

00:34:11:16 - 00:34:13:00
is to give you the floor.

00:34:13:00 - 00:34:16:05
We are here to talk
with you, to exchange with you.

00:34:16:09 - 00:34:21:23
I'm sure you have questions

00:34:21:26 - 00:34:25:16
Over to Jerome.

00:34:25:18 - 00:34:26:29
So, we are listening.

00:34:26:29 - 00:34:31:19
You mentioned the fact that we are

00:34:31:21 - 00:34:34:17
somehow being cheated
in the use of our data,

00:34:34:17 - 00:34:36:02
because it is our property.

00:34:36:02 - 00:34:40:00
Or that we put at risk a

00:34:40:00 - 00:34:43:20
certain morality that you describe
in relation to the goods it generates.

00:34:43:23 - 00:34:46:29
What I have heard,
when these kinds of topics came up

00:34:47:02 - 00:34:50:18
over the past several years, is certain
American companies saying:

00:34:50:21 - 00:34:56:06
But if you have nothing to reproach yourself for,
what is your problem, sir?

00:34:56:08 - 00:34:56:21
So I

00:34:56:21 - 00:35:01:28
ask the question: beyond the moral
and intellectual aspects you mention,

00:35:02:00 - 00:35:05:11
why focus on the problem?

00:35:05:13 - 00:35:07:23
And I answer

00:35:07:23 - 00:35:10:20
first of all, it is a question —

00:35:10:20 - 00:35:11:02
thank you.

00:35:11:02 - 00:35:14:23
Who will be the censor to know if it is
a problem or not?

00:35:14:25 - 00:35:16:00
Who will decide?

00:35:16:00 - 00:35:20:22
We see it in particular
on these proprietary platforms,

00:35:20:22 - 00:35:25:01
in contact and in the dematerialized world;
above all we see

00:35:25:04 - 00:35:30:12
these issues of censoring
content — for example on platforms —

00:35:30:14 - 00:35:33:10
And the thought that I have,
since my common thread is

00:35:33:10 - 00:35:36:20
freedom, is this: who is going to say,

00:35:36:22 - 00:35:40:27
who will judge our content,
who will be —

00:35:40:29 - 00:35:43:15
who will be the censor

00:35:43:15 - 00:35:46:09
of this content,
which on social networks is often

00:35:46:09 - 00:35:49:00
political content?

00:35:49:00 - 00:35:52:23
So it is this that
is shaping society.

00:35:52:25 - 00:35:57:19
And I think that our association
and our think tank — all the more

00:35:57:19 - 00:36:03:14
since, once again, we want to become
truly operational — have a vocation

00:36:03:14 - 00:36:09:20
to think about these questions,
here too, philosophically.

00:36:09:22 - 00:36:12:03
Good. I want to say one thing here:
what you say

00:36:12:03 - 00:36:15:22
is true — that is the argument
that is constantly used,

00:36:15:24 - 00:36:18:01
used by those who understood

00:36:18:01 - 00:36:21:20
that with the data,
they will make money

00:36:21:22 - 00:36:25:12
with the data,
they will keep the power

00:36:25:15 - 00:36:27:17
And we hear an interior minister say:

00:36:27:17 - 00:36:31:05
But wait — for the Olympics,
we are going to deploy facial recognition.

00:36:31:07 - 00:36:34:28
But if you have nothing to reproach
yourself for, why ultimately do you object

00:36:34:28 - 00:36:36:17
to that?

00:36:36:17 - 00:36:38:29
Bullshit!

00:36:38:29 - 00:36:41:11
In this case, the issue,
it's freedom.

00:36:41:11 - 00:36:45:06
The issue is that once again,
we are in a democracy

00:36:45:06 - 00:36:47:22
which, at times,
can be less than democratic.

00:36:47:22 - 00:36:51:07
But the day it becomes a dictatorship —
and fascism

00:36:51:10 - 00:36:54:10
may not be far away in our country —

00:36:54:14 - 00:36:57:15
When these tools are in place,
what will happen?

00:36:57:22 - 00:37:01:06
I have children,
I have grandchildren.

00:37:01:08 - 00:37:04:02
And let us think again before we put
in place infrastructures,

00:37:04:02 - 00:37:08:17
ways of thinking, and finally
the spread of this type of slogan:

00:37:08:20 - 00:37:13:01
"if you have nothing to reproach
yourself for, we can do anything"?

00:37:13:04 - 00:37:16:11
No. First, if my interlocutor

00:37:16:14 - 00:37:19:12
tells me that while making money,
I say OK,

00:37:19:12 - 00:37:22:11
so we share the money

00:37:22:11 - 00:37:23:19
and we get organized.

00:37:23:19 - 00:37:27:09
And if the other one,
the controlling body, says

00:37:27:09 - 00:37:30:23
"I want to control everything," at that moment
I say I am leaving the game.

00:37:30:26 - 00:37:33:26
So we are looking for
a middle way,

00:37:33:28 - 00:37:36:28
essential and practical,
where this kind of slogan

00:37:36:29 - 00:37:39:29
must be destroyed.

00:37:40:07 - 00:37:41:26
I just wanted to add

00:37:41:26 - 00:37:45:07
that we have nothing to reproach ourselves for.

00:37:45:10 - 00:37:48:05
Our subject is the enlightened man.

00:37:48:05 - 00:37:52:01
In the era of information overcapacity,

00:37:52:04 - 00:37:54:21
we are no longer enlightened.

00:37:54:21 - 00:37:55:26
You have enlightened yourself.

00:37:55:26 - 00:37:58:00
It is precisely necessary

00:37:58:02 - 00:38:00:01
to come together and have a project.

00:38:00:01 - 00:38:01:28
It is not that simple.

00:38:01:28 - 00:38:04:08
In the digital age,

00:38:04:08 - 00:38:08:21
we cannot exactly claim
to compete with a machine.

00:38:08:23 - 00:38:10:03
It's pretty clear.

00:38:10:03 - 00:38:13:01
So from there, yesterday's
enlightened man —

00:38:13:01 - 00:38:16:10
the man of the Enlightenment — is
not at all the same enlightened man

00:38:16:10 - 00:38:17:25
as today's.

00:38:17:25 - 00:38:19:15
An example:

00:38:19:15 - 00:38:21:10
we talked a lot about GPT.

00:38:21:10 - 00:38:22:24
Everyone here says we're great.

00:38:22:24 - 00:38:29:01
protecting our personal data
is good, it must be regulated —

00:38:29:04 - 00:38:30:19
but that is no longer where we are.

00:38:30:19 - 00:38:33:19
We are now, in fact, at the cross-referencing
of other industrial data

00:38:33:19 - 00:38:37:01
which perfectly allows
a person to be identified.

00:38:37:04 - 00:38:39:10
Does the enlightened man know this?

00:38:39:10 - 00:38:42:06
No — we do not even know at all
what data has been generated,

00:38:42:06 - 00:38:44:12
and we do not even know
what processing is being done.

00:38:44:12 - 00:38:48:26
So that is also the job of Data
Alliance: to enlighten.

00:38:48:28 - 00:38:49:28
Thank you.

00:38:49:28 - 00:38:52:20
Data is like love:
it is the intention that counts.

00:38:52:20 - 00:38:55:11
And the question, in fact, is: what is
the objective, and what is the intention?

00:38:55:11 - 00:38:56:05
Actually,

00:38:56:05 - 00:38:59:17
I come back to the social contract,
which today implicitly retains

00:38:59:19 - 00:39:00:08
the intention.

00:39:00:08 - 00:39:05:04
It is still the conquest of peoples,
who remain subjugated and controlled.

00:39:05:04 - 00:39:05:24
Populations.

00:39:05:24 - 00:39:08:16
From this point of view,
the one who actually says,

00:39:08:16 - 00:39:10:09
we don't care, we control the data,
we control.

00:39:10:09 - 00:39:11:02
He is completely right.

00:39:11:02 - 00:39:13:28
It is more efficient: you
will no longer own anything.

00:39:13:28 - 00:39:16:25
You will be happy
because the implicit intention

00:39:16:25 - 00:39:20:10
remains the conquest
and population control.

00:39:20:13 - 00:39:23:11
If we change the intention
and place it in human progress —

00:39:23:11 - 00:39:24:29
it's not explicit
in the social contract.

00:39:24:29 - 00:39:26:26
Even in 1789, it must be reread.

00:39:26:26 - 00:39:29:25
And so, if we change the intention,
we are going to change the whole system.

00:39:29:25 - 00:39:32:24
And me, when this question is asked
here, I put myself in their heads.

00:39:32:24 - 00:39:36:07
But it's much more effective
to conquer.

00:39:36:10 - 00:39:39:22
To be in an imperialist logic,
it's super clear.

00:39:39:25 - 00:39:42:19
There you go: if the logic is human,
that is what we need to re-examine.

00:39:42:19 - 00:39:46:16
As for me, I would rather say that my
thing is the enlightened human.

00:39:46:18 - 00:39:51:07
But if 99% of us decide that there is
nothing to be done, it ends in tearing each other apart.

00:39:51:10 - 00:39:52:20
Well, anyway.

00:39:52:20 - 00:39:54:02
It is this implicitness that is the problem.

00:39:54:02 - 00:39:55:07
And those who ask this question —

00:39:55:07 - 00:39:57:20
there, what troubles me is:
what is their intention?

00:39:57:20 - 00:39:59:20
I do not even know if they are aware of it.

00:39:59:20 - 00:40:02:06
But thank you again Fanny.

00:40:02:06 - 00:40:06:29
I give the floor to Henry Peyret.

00:40:07:01 - 00:40:09:15
Hello, and congratulations on the presentation.

00:40:09:15 - 00:40:14:11
I still have a small problem:
in fact, as individuals,

00:40:14:13 - 00:40:16:15
we generate data.

00:40:16:15 - 00:40:20:18
But this data cannot belong
only to us,

00:40:20:18 - 00:40:28:01
and over which we alone would have

00:40:28:04 - 00:40:30:19
control.

00:40:30:19 - 00:40:34:01
You actually talked
mainly about the commons,

00:40:34:04 - 00:40:37:24
but to be able to establish the commons,
a certain number of bodies,

00:40:37:26 - 00:40:42:12
particularly states, must be able to use this
data regardless of your will.

00:40:42:15 - 00:40:42:29
All right.

00:40:42:29 - 00:40:45:29
Otherwise, we will not be able to define
a certain number of commons.

00:40:46:06 - 00:40:48:25
It comes back to what you were saying,
Stephanie, which is:

00:40:48:25 - 00:40:50:03
Ah yes, but be careful,

00:40:50:03 - 00:40:52:15
what is the intention
behind this governance,

00:40:52:15 - 00:40:56:01
in being able to use the data
to make of it a service that is useful

00:40:56:01 - 00:40:59:14
to us,
individually and collectively?

00:40:59:16 - 00:41:01:03
We must understand. And act.

00:41:01:03 - 00:41:05:27
Today we have to have
a revolution of collectives,

00:41:06:00 - 00:41:08:13
The collectives that brought us together

00:41:08:13 - 00:41:11:13
for hundreds of years have been and

00:41:11:13 - 00:41:14:09
are mutating.

00:41:14:09 - 00:41:17:16
The family with blended families,
a certain number of things

00:41:17:16 - 00:41:21:07
like that, or even chosen
families, is in the process of mutating.

00:41:21:10 - 00:41:24:22
It is no longer the blood family
we had for so long.

00:41:24:22 - 00:41:27:22
This is one of the circles
on which we will be able to apply

00:41:27:24 - 00:41:31:00
a word that I will use,
even though

00:41:31:02 - 00:41:32:18
I'm going to twist the word.

00:41:32:18 - 00:41:34:25
This is the notion of sovereignty.

00:41:34:25 - 00:41:38:07
We constantly have it drummed into us
that we must keep

00:41:38:07 - 00:41:42:16
sovereignty and therefore we will keep
individual data sovereignty.

00:41:42:19 - 00:41:47:10
Yes, but sovereignty
applies to the circle of the country.

00:41:47:13 - 00:41:48:28
Okay, but now let's imagine

00:41:48:28 - 00:41:51:28
that we use sovereignty
to have the family circle.

00:41:52:03 - 00:41:54:06
Who is the sovereign of the family?

00:41:54:06 - 00:41:55:20
For a very long time, it was the father;

00:41:55:20 - 00:41:59:16
the father, the patriarch,
was the master of this circle.

00:41:59:18 - 00:42:04:19
But the family circle,
the circle of regions,

00:42:04:22 - 00:42:06:10
the circle of the country —

00:42:06:10 - 00:42:08:27
the country is changing:
many of our children

00:42:08:27 - 00:42:13:29
would no longer fight for
us, to defend the country's borders.

00:42:14:02 - 00:42:16:20
And we are now moving towards
other circles, such as the continent.

00:42:16:20 - 00:42:17:16
And above all, a circle

00:42:17:16 - 00:42:22:01
that we became very strongly
aware of during the Covid period:

00:42:22:04 - 00:42:23:13
Earth.

00:42:23:13 - 00:42:25:28
It's a new circle, it's
a circle on which, in fact,

00:42:25:28 - 00:42:29:13
we will have to share data
regardless of our will,

00:42:29:15 - 00:42:33:06
If we want to resolve
the problems of the earth.

00:42:33:09 - 00:42:34:16
Thank you Henri.

00:42:34:16 - 00:42:37:15
Just one thing: when you take
the floor at length,

00:42:37:15 - 00:42:40:28
you have to come here
because otherwise we won't see you.

00:42:41:01 - 00:42:42:00
It's completely okay.

00:42:42:00 - 00:42:45:22
And on the other hand, these subjects,
I invite you to follow

00:42:45:25 - 00:42:49:15
all the conferences
which will take place today and tomorrow.

00:42:49:17 - 00:42:53:09
This afternoon in particular,
we are going to talk about blockchain and the redesign of

00:42:53:09 - 00:42:54:10
territories.

00:42:54:10 - 00:42:56:22
But in fact, we go beyond the redesign
of territories.

00:42:56:22 - 00:43:02:15
This raises the question of governance
of our society and organizations

00:43:02:17 - 00:43:06:22
of who will
effectively govern our society.

00:43:06:24 - 00:43:11:07
We are indeed at the dawn
of a real change in society,

00:43:11:09 - 00:43:16:03
and it is not by freezing things,
by copying our organizational system,

00:43:16:03 - 00:43:19:01
that we will actually respond
to these challenges.

00:43:19:01 - 00:43:22:19
Of course,
all these circles are actually

00:43:22:21 - 00:43:25:27
very important, and that too is
the strength of digital.

00:43:25:29 - 00:43:26:19
It is that

00:43:26:19 - 00:43:31:06
we are going to have governance at a variable
scale, with variable circles,

00:43:31:06 - 00:43:36:10
according to the level of criticality
or the level of —

00:43:36:13 - 00:43:40:10
of emotion, for example.

00:43:40:13 - 00:43:41:17
And that is its strength.

00:43:41:17 - 00:43:45:24
So in fact, we see that we cannot
impose a frozen organization,

00:43:45:24 - 00:43:51:03
whatever it may be,
on a world whose

00:43:51:05 - 00:43:53:13
interactions are variable.

00:43:53:13 - 00:43:56:04
But thank you — it was very clear,
very striking,

00:43:56:04 - 00:43:59:28
and it perfectly
introduces the debate for all the

00:43:59:28 - 00:44:02:28
conferences to come,
since today we are going to talk,

00:44:03:03 - 00:44:06:14
right after this, about the digitization of
buildings, then about

00:44:06:14 - 00:44:10:17
new professions, where we will begin
to talk about trusted third parties.

00:44:10:19 - 00:44:14:00
Then, we will talk about the commons,
such as the digital commons.

00:44:14:03 - 00:44:17:20
We are going to talk about AI,
about blockchain, about the redesign of

00:44:17:20 - 00:44:22:22
territories, democracy
in the digital age

00:44:22:25 - 00:44:24:20
and about the economy —

00:44:24:20 - 00:44:27:29
about economic models,
since economic models

00:44:27:29 - 00:44:29:04
are changing.

00:44:29:04 - 00:44:32:28
But actually, we are not
there to assert things either.

00:44:33:01 - 00:44:37:03
We are here to question ourselves together,
to ask questions,

00:44:37:06 - 00:44:40:16
find draft answers
or food for thought.

00:44:40:18 - 00:44:43:25
This is the purpose of these two days.

00:44:43:27 - 00:44:46:27
Are there any other questions?
or comments?

00:44:46:29 - 00:44:49:26
Richard, you want to bring us
a small remark?

00:44:49:26 - 00:44:54:07
You were talking about architectural change,

00:44:54:10 - 00:44:56:26
you, a specialist in architecture
and in change.

00:44:56:26 - 00:45:01:11
Stephanie,
it is your hobby-horse, and so

00:45:01:14 - 00:45:05:03
what do you want to say
about this approach, then?

00:45:05:05 - 00:45:06:10
I think we have to be vigilant.

00:45:06:10 - 00:45:08:08
Indeed,
on this question of the collective.

00:45:08:08 - 00:45:11:08
It is essential.

00:45:11:13 - 00:45:15:08
Because in a moment of questioning,
of change and precariousness

00:45:15:08 - 00:45:18:29
felt by all, it is
the collective that holds us together.

00:45:19:01 - 00:45:22:28
So that's the answer, it's proximity.

00:45:23:01 - 00:45:27:19
I think we still have to tell ourselves
that today — and that is the challenge

00:45:27:19 - 00:45:31:06
that we are trying to meet together
and with you who join us —

00:45:31:08 - 00:45:32:02
We are at the beginning.

00:45:32:02 - 00:45:35:06
Ultimately, we are trying, at Innopolis,
to be incubators

00:45:35:06 - 00:45:41:24
of this transformation,
of this necessary bifurcation.

00:45:41:26 - 00:45:44:22
We used to think in terms of the
Anthropocene. The Anthropocene — what is it?

00:45:44:22 - 00:45:50:18
It’s the impact of humans
on the life of society, of the earth.

00:45:50:21 - 00:45:54:10
And in fact,
we are now in the Capitalocene stage,

00:45:54:12 - 00:45:56:01
and I think that

00:45:56:01 - 00:46:00:08
we still need to have this lucidity,
once again,

00:46:00:10 - 00:46:03:10
within which we must think of the other.

00:46:03:18 - 00:46:06:18
As part of the Foundation
for the transition — go and see that.

00:46:06:24 - 00:46:11:18
And we worked with a program that
is called Stop Energy Exclusion.

00:46:11:24 - 00:46:15:25
On the question of making ends meet,
to create solidarity,

00:46:15:25 - 00:46:19:27
I think that in our data
business too,

00:46:19:27 - 00:46:24:03
shared data is a lever for
the new solidarity to be invented.

00:46:24:05 - 00:46:27:22
I think
there must be values that we share.

00:46:27:25 - 00:46:30:25
And these values, which we will try to uphold,

00:46:31:02 - 00:46:33:22
are those inherited

00:46:33:24 - 00:46:36:06
from the Declaration of the Rights of Man
and of the Citizen.

00:46:36:06 - 00:46:39:06
That is still what guides us.

00:46:39:09 - 00:46:40:13
Thank you, Richard.

00:46:40:13 - 00:46:44:25
Just to come back
to the principle of subsidiarity

00:46:44:27 - 00:46:47:17
and architecture.

00:46:47:17 - 00:46:49:08
I'm asking the question

00:46:49:08 - 00:46:51:25
on digital architecture

00:46:51:25 - 00:46:53:20
because in fact today,

00:46:53:20 - 00:46:56:20
if we talk about digital, it's
because there is a digital architecture

00:46:56:24 - 00:46:59:22
and we see that this architecture
is very centralized.

00:46:59:22 - 00:47:01:24
We talked about it. Mathieu,

00:47:01:24 - 00:47:03:26
I ask you
the question: can we not have

00:47:03:26 - 00:47:06:15
a decentralized architecture,

00:47:06:15 - 00:47:10:26
since you talked about circles —
one that speaks of us and that extends

00:47:10:28 - 00:47:15:09
neuronally, and not
in a pyramidal way?

00:47:15:11 - 00:47:17:03
It's fundamental.

00:47:17:03 - 00:47:20:13
And from the moment we have understood
that, indeed, we will say to ourselves:

00:47:20:13 - 00:47:22:19
if we start from the smallest
and move towards the greatest,

00:47:22:19 - 00:47:26:13
we are going to rethink the organization
of our society, and we will inevitably,

00:47:26:13 - 00:47:32:01
logically, arrive
at new social organizations

00:47:32:04 - 00:47:37:15
to manage this confidentiality,
this data sovereignty.

00:47:37:17 - 00:47:39:26
In 1789,

00:47:39:28 - 00:47:40:27
what was it?

00:47:40:27 - 00:47:43:01
Was it the end of the third estate?

00:47:43:01 - 00:47:45:04
And today,
precisely, are we not

00:47:45:04 - 00:47:49:01
in need of rethinking
such an organization

00:47:49:03 - 00:47:50:23
around data governance?

00:47:50:23 - 00:47:52:01
It's a question.

00:47:52:01 - 00:47:56:16
But I think this question will
be asked during these two days.

00:47:56:19 - 00:48:00:09
Could the common, precisely,
be this very infrastructure?

00:48:00:11 - 00:48:03:00
That is to say,
we are moving towards a new infrastructure

00:48:03:00 - 00:48:08:26
for data, and this infrastructure
must belong to the supplier,

00:48:08:29 - 00:48:11:22
to the people who produce this data,
to the producer of this data.

00:48:11:22 - 00:48:12:28
So that’s what’s common.

00:48:12:28 - 00:48:17:03
In fact, it is this infrastructure
that will be able to generate

00:48:17:06 - 00:48:21:18
positive externalities — benefits of all
kinds, or economic benefits.

00:48:21:24 - 00:48:25:01
Well, this infrastructure,
we can make it our own,

00:48:25:03 - 00:48:28:16
and I think it is the capacity of the
commons to coexist

00:48:28:16 - 00:48:34:28
with established forms,
not leaving the State

00:48:35:01 - 00:48:37:20
alone, but also

00:48:37:20 - 00:48:41:14
involving companies, and above all
working with the State again.

00:48:41:14 - 00:48:49:18
It is a new governance in the

00:48:49:21 - 00:48:54:27
practical sense —

00:48:54:29 - 00:48:55:29
saved by the bell.

00:48:55:29 - 00:48:57:26
Thank you.

00:48:57:26 - 00:48:59:21
And first of all, well done.

00:48:59:21 - 00:49:02:08
No, no, it's not going to be long, it's
a question.

00:49:02:08 - 00:49:04:10
Bravo for your initiative.

00:49:04:10 - 00:49:08:15
I still wonder
about the immensity of the effort required

00:49:08:17 - 00:49:11:17
to get where we are going, which
seems to me quite clear,

00:49:11:20 - 00:49:15:02
Completely legitimate and necessary.

00:49:15:04 - 00:49:17:15
Here is the question I ask myself:

00:49:17:15 - 00:49:20:13
are there not sectors
of activity

00:49:20:13 - 00:49:23:04
in the national
or international economy

00:49:23:04 - 00:49:27:08
that have more influence than others,
and on which we should

00:49:27:11 - 00:49:30:11
work as a priority to

00:49:30:16 - 00:49:33:00
move forward and
give the maximum chance of success?

00:49:33:00 - 00:49:37:27
I have the beginnings of an answer,
but I would like to have your opinion

00:49:37:29 - 00:49:39:12
on the entire sector

00:49:39:12 - 00:49:42:20
related to real estate,
the living environment, and the territory.

00:49:42:22 - 00:49:46:02
I think so. This is my experience

00:49:46:05 - 00:49:47:22
and a bit of my analysis of it.

00:49:47:22 - 00:49:52:27
This sector generates a lot of data
and it is at the user level.

00:49:52:29 - 00:49:56:12
There are plenty of professionals
involved in all these sectors.

00:49:56:14 - 00:49:58:23
If I had to recommend

00:49:58:23 - 00:50:03:01
a sector to act on first,
to pull the others along.

00:50:03:01 - 00:50:05:12
For me, it would be on what I call

00:50:05:12 - 00:50:09:02
real estate, living environment, territory
and so on.

00:50:09:09 - 00:50:10:21
THANKS.

00:50:10:21 - 00:50:12:13
Thanks Patrick. You are completely right.

00:50:12:13 - 00:50:17:11
In any case, this approach
must be supported by verticals,

00:50:17:13 - 00:50:20:10
and we must indeed
find areas so that we can

00:50:20:10 - 00:50:24:04
raise awareness among all citizens.

00:50:24:07 - 00:50:26:08
The Data Governance Forum is making precisely that effort.

00:50:26:08 - 00:50:28:09
It is multi-sectoral.

00:50:28:09 - 00:50:31:20
We have several sectors:
energy and water, no,

00:50:31:20 - 00:50:35:14
energy and carbon,
because it is vital.

00:50:35:16 - 00:50:37:16
And you will see, tomorrow,

00:50:37:16 - 00:50:40:16
we're going to talk about local
energy communities,

00:50:40:20 - 00:50:43:22
and there, it's really concrete,
and we will see that it can only work

00:50:43:29 - 00:50:46:29
if there is governance
of energy data.

00:50:47:01 - 00:50:52:10
Otherwise the citizen, the energy
consumer, will be left behind

00:50:52:13 - 00:50:55:25
and will not be involved
in this local loop.

00:50:55:27 - 00:50:57:19
Same for water.

00:50:57:19 - 00:50:59:19
The same goes for buildings and spaces.

00:50:59:19 - 00:51:00:19
You are completely right.

00:51:00:19 - 00:51:04:15
Buildings and spaces,
it is indeed a sector.

00:51:04:17 - 00:51:07:16
There are also the territories,
cities and towns.

00:51:07:16 - 00:51:11:03
We talked,
I was talking about the amount of data.

00:51:11:10 - 00:51:15:05
Today, it is health that transmits
the most data in the world.

00:51:15:05 - 00:51:18:03
This is 50% of the data
that are processed around the world.

00:51:18:03 - 00:51:21:25
And health, indeed
is one of our priorities.

00:51:21:28 - 00:51:23:03
Mobility too.

00:51:23:03 - 00:51:26:03
We clearly see that today,
as an elected official in my municipality

00:51:26:09 - 00:51:29:16
and also an elected official of the metropolis,

00:51:29:19 - 00:51:33:01
we can clearly see that we have a real problem
of access to mobility data

00:51:33:08 - 00:51:35:04
and that we cannot have
a public policy

00:51:35:04 - 00:51:38:27
in terms of mobility
if we don't have access to this data.

00:51:39:00 - 00:51:40:23
So actually, you're completely right.

00:51:40:23 - 00:51:44:06
It is indeed necessary to anchor
our discourse in the field

00:51:44:08 - 00:51:48:03
and in daily life, in topics
that are major ones.

00:51:48:05 - 00:51:51:05
Emmanuelle, very quickly, on the question,
which is very interesting,

00:51:51:06 - 00:51:53:10
which is more of an observation
than a question,

00:51:53:10 - 00:51:56:13
to say we have a lot of data
in the real estate sector,

00:51:56:13 - 00:51:57:24
in the territorial sector
and precisely,

00:51:57:24 - 00:52:01:25
we are almost in the reverse
situation with this data.

00:52:01:25 - 00:52:05:12
It is fragmented and it is not exploited.

00:52:05:14 - 00:52:08:05
And there we clearly see the importance

00:52:08:05 - 00:52:11:18
to regain control of data
governance, and that finally,

00:52:11:18 - 00:52:15:15
in everything that has been said, it is
a smart contract problem.

00:52:15:15 - 00:52:20:00
We talk about it, we have eminent
representatives of the law here,

00:52:20:01 - 00:52:22:29
well, not representatives of the law.
They will talk about it better than I can.

00:52:22:29 - 00:52:23:27
But finally,

00:52:23:27 - 00:52:27:24
when I said earlier
we must indeed spell out these unspoken points.

00:52:27:27 - 00:52:29:14
There needs to be clarity in the contract.

00:52:29:14 - 00:52:31:26
I gave consent,
but on what exactly?

00:52:31:26 - 00:52:35:07
And in the world of construction,
which is excessively fragmented,

00:52:35:07 - 00:52:38:07
this is one of the focus areas
that we have, for our part, on Open four.

00:52:38:08 - 00:52:42:08
Indeed, how do we establish
data sharing that makes it possible

00:52:42:10 - 00:52:45:03
to generate value on the one hand,
and above all

00:52:45:03 - 00:52:48:11
to share things
and to involve all stakeholders.

00:52:48:13 - 00:52:51:13
And this collective sharing,

00:52:51:13 - 00:52:54:16
it is essential
and today we don't have the tools.

00:52:54:16 - 00:52:57:23
So we are in exactly the opposite
case from some of the examples

00:52:57:26 - 00:53:00:16
given about the bad GAFA.

00:53:00:19 - 00:53:02:27
I don't think that is how
it should be presented.

00:53:02:27 - 00:53:07:23
But there are people who knew how to exploit
things more or less without our knowledge.

00:53:07:25 - 00:53:09:27
There are other sectors
that, on the contrary, are lagging behind.

00:53:09:27 - 00:53:11:13
Let's try to rebalance

00:53:11:13 - 00:53:13:29
and recreate value
where, for the moment, there is none.

00:53:13:29 - 00:53:18:10
Because precisely, with a reflex
that is perhaps too protectionist,

00:53:18:12 - 00:53:21:28
in the end, we get exactly the opposite
of what we wanted.

00:53:22:00 - 00:53:25:00
We have fragmentation
and no use of important

00:53:25:00 - 00:53:28:09
knowledge. Thanks again.

00:53:28:11 - 00:53:31:11
And see you at the next conference

00:53:31:12 - 00:53:34:29
which will follow immediately.

00:53:35:02 - 00:53:36:26
Thank you really.

00:53:36:26 - 00:53:41:00
On mastery,

00:53:41:03 - 00:53:43:24
on the digitalization
of buildings and cities,

00:53:43:24 - 00:53:46:23
a prerequisite for their
environmental transition.

00:53:46:23 - 00:53:49:23
So we are indeed answering
Patrick Pontier's question.

00:53:49:27 - 00:53:52:19
Here we go


bottom of page