robinbloke: (It's in there somewhere)
[personal profile] robinbloke
Information Overload

Take a planet, E. Let this planet be our subjective model for this concept, or indeed any habitable medium.
Take the population of this planet, P. This value is effectively unlimited and will increase at standard reproduction rates, up to and including the point at which the population leaves said planet and colonizes one or more others.
Let I be the number of pieces of information artificially generated and traceable about each individual of the species, normalised across the population such that those with a higher information storage rating are lowered against those that have little, or indeed, none. Note that insofar as discoveries and general knowledge are concerned, these will be referenced and counted as part of that individual's information.
Further to this, let D be the number of said population who are deceased and yet still have information retained about them, again normalised.
To this, let C be a measure of how much of this information is 'repeated' and can therefore be optimised, deleted or removed, giving a single scalar measure of information-storage technology: a factor between 0 and 1, where 1 indicates no compression and 0 indicates (impossible) perfect compression.
Let M be the total amount of consolidated referenceable memory available to the population for information storage.

The proposal is thus:

At some point M will be exceeded, when technology fails to keep up with the demands of information storage, i.e. when

M = ((P + D) * I) / C

Is the worldmind, as we know it, living on borrowed time?
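(A back-of-the-envelope sketch in Python, just to make the arithmetic concrete; the values plugged in for P, D, I, C and M below are invented placeholders, not estimates of anything:)

# Sketch of the formula above; all numbers are made-up placeholders.

def storage_demand(P, D, I, C):
    """Memory demanded by the worldmind: ((P + D) * I) / C.

    C is the compression factor in (0, 1]: 1 means no compression,
    values approaching 0 mean (impossibly good) compression.
    """
    if not 0 < C <= 1:
        raise ValueError("C must lie in (0, 1]")
    return (P + D) * I / C

P = 7.0e9    # living population
D = 1.0e11   # deceased individuals whose information is still retained (normalised)
I = 1.0e9    # bytes of traceable information per individual (normalised)
C = 0.3      # effective compression factor
M = 1.0e21   # total consolidated referenceable memory, in bytes

demand = storage_demand(P, D, I, C)
print(f"demand = {demand:.3e} bytes vs capacity M = {M:.3e} bytes")
print("Worldmind on borrowed time." if demand > M else "Still within M, for now.")

On those made-up numbers the demand still sits below M; the point is only that the demand grows with (P + D) while M is fixed by whatever storage the population has actually built.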

Date: 2008-08-23 12:42 am (UTC)
ext_8559: Cartoon me  (Default)
From: [identity profile] the-magician.livejournal.com
One of the things the internet hasn't got (yet) is proper aging of data ... websites sit there looking as young as they did when they were written in 1997 but not having been updated in over a decade ... any decent system of entropy would not rely on ISPs going out of business to take out vast amounts of geocities pages etc. ...

... there's a lot of stuff about a person's life that stops being interesting after 50 or 100 years (or certainly 100 years after their death), so an algorithm can check what data is considered significant, and whether anyone has ever accessed any data on this person, and at some point they can be converted into an instance of "another 23rd century person, much like any other that lived in megopolis", and so you remove redundant data and just keep significant events/differences ... and at some time in the future there stops being anything/much significant about insignificant people ...

or something :-)
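(Sketched very roughly in Python, with entirely hypothetical record fields like death_year, last_accessed_year and significant_events; this is just the shape of the rule, not any real archive's API:)

# Rough sketch of the aging/"entropy" rule described above; field names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PersonRecord:
    name: str
    death_year: Optional[int] = None
    last_accessed_year: Optional[int] = None
    significant_events: List[str] = field(default_factory=list)
    raw_data_bytes: int = 0

def age_record(rec: PersonRecord, current_year: int, cutoff: int = 100) -> PersonRecord:
    """Collapse long-dead, never-accessed records into a generic stub,
    keeping only the significant events/differences."""
    long_dead = rec.death_year is not None and current_year - rec.death_year > cutoff
    if long_dead and rec.last_accessed_year is None and not rec.significant_events:
        # Nothing distinguishes this record any more: replace it with a generic stub.
        return PersonRecord(name="another person, much like any other",
                            death_year=rec.death_year)
    if long_dead:
        rec.raw_data_bytes = 0  # drop the redundant detail, keep the significant events
    return rec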

Date: 2008-08-23 01:09 am (UTC)
From: [identity profile] robinbloke.livejournal.com
There you are adjusting I and C, but still the hard limit of M remains...

Date: 2008-08-23 07:58 am (UTC)
ext_8559: Cartoon me  (Default)
From: [identity profile] the-magician.livejournal.com
Don't see why it's a hard limit ... infinite number of parallel universes, stick one byte in each, voila!

And why "consolidated"? I'm not saying "not consolidated", I'm asking why that's a requirement of M that needs to be considered.

And for that matter, what's wrong with lossy compression and entropy as I posited? The Worldmind doesn't need to know everything about everybody, and until there's instantaneous comms across the entire universe, a lot of information that arrives will be so out of date as to not need to be stored.

But on the other hand, I don't really care :-)

Off to breakfast at Discworldcon now ...!

Date: 2008-08-23 10:38 am (UTC)
From: [identity profile] robinbloke.livejournal.com
Ah, but with infinite universes you end up with infinite population; if you can populate them with data, why not people?

It's a consideration of the point at which the collective computer memory of the world 'breaks' and can no longer handle the requirements of the population.

Have a great time at Pratchett-con :)

Date: 2008-08-23 08:00 am (UTC)
ext_8559: Cartoon me  (Default)
From: [identity profile] the-magician.livejournal.com
Oh, and P has limits too, at some point if the expansion keeps increasing, the entire surface is covered 15 deep in people having sex, having babies and eating the people around them or something ... :-)
(deleted comment)
(deleted comment)

Date: 2008-08-23 02:40 am (UTC)
From: [identity profile] robinbloke.livejournal.com
YEEESSS!!!

Date: 2008-08-23 11:18 am (UTC)
zotz: (Default)
From: [personal profile] zotz
P does not behave as you describe. It is currently in the process of stabilising, and after it does so (probably in about 50 years) it is expected to go into slow long-term decline. The length of that latter trend, of course, is unpredictable.

Date: 2008-08-26 08:15 am (UTC)
From: [identity profile] robinbloke.livejournal.com
Not in the context of our world at present, no; this is a more simplistic formula that doesn't take into account limits of expansion. Although if P is interpreted as the number of people who are required to have data stored about them, that could keep P growing until M breaks.

Date: 2008-08-25 01:41 pm (UTC)
From: [identity profile] texassky.livejournal.com
P will never stabilize!
The parameters for P, as set forth by the penguin, are that P follows standard trends. Certain segments of the world population do not believe in limiting reproduction, and current trends in most of the world are that, even with limited reproduction, 2 adults produce 2 children. Those children produce children faster than D comes about, and D does produce its own I. Not to mention that the practice of limiting P produces I in and of itself.

Bill Gates: "640K ought to be enough for anybody."

Obviously Mr. Gates had not reviewed Rob's theory.
