Memory

Many people are not aware of this, but programs (apps, for the young) are loaded into a computer’s memory the moment you start them, and sometimes earlier, when you turn on your computer. Computer memory is a limited and volatile resource. Secondary memory was invented so that your apps could also store permanent data, such as configuration data, input and output, etc. It materialized in the form of hard disks: slower than main memory, but reliable enough for permanent storage. People needed ways to find stored stuff without having to memorize addresses, so a structured organization was conceived in the form of objects, files and folders, using a physical-world analogy.

The introduction of the personal computer (PC) turned computers from number-crunching machines into digital production tools. Traditional production tools were gradually replaced by the computer for text editing, video editing, graphics, record keeping, etc. Eventually this phenomenon increased the demand for larger permanent storage capacity. Every new computer model had to advertise at least a 2x increase in its disk capacity to catch the market’s eye. Worldwide Internet adoption raised the pressure a few levels higher. Your current model will never satisfy you the way the latest and greatest promises to. It’s an addiction. The more, the better.

Little time has been spent rethinking this clearly unsustainable model. Do you really need the largest disk in the world on your personal computer? I find that keeping files that are, and most likely will remain, available on the Internet makes no sense. We should really ask ourselves this question: how much memory do we want to keep? My take on that question is this:

  • Personal storage: For digital objects such as your personal and family photos and videos, documents, or anything you produce and most likely don’t share; in short, objects you want to keep with you all the time. Backup for this is critical and should be done the way we do it currently: full and incremental.
  • Shared storage: For digital objects you produce and share with the world, and for objects that are available on the Internet but that you want to keep a copy of and occasionally sync up with the source. Backup for this should be mostly metadata describing each object’s source and other relevant information, plus full backups for specific objects. Say you are a scientist working on a data set that you’ve grabbed from a public source.
  • Internet storage: For anything that is available on the Internet. Such objects are copied to your computer but not kept there, since they will most likely always be available online. Software installers, development components, video and audio streams (including torrents), etc. are candidates for this area. These objects basically live on the Internet and you only sometimes want to use them. If you want to keep a copy, mark it as “shared” to move it to your “shared storage”. Backup for this is not necessary; you can simply keep metadata or URIs at the application or filesystem level, as sketched below.
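
As a rough illustration of the “Internet storage” idea, the sketch below keeps only the metadata needed to re-fetch an object later instead of backing up its bytes. This is a minimal Python sketch; the manifest file name, the record layout, and the record_internet_object function are placeholders of mine, not an existing tool or format.

    import hashlib
    import json
    import pathlib

    # Hypothetical manifest location and record layout, for illustration only.
    MANIFEST = pathlib.Path("internet_storage_manifest.json")

    def record_internet_object(local_copy: pathlib.Path, source_uri: str) -> None:
        # Keep just enough metadata to find and verify the object again later,
        # instead of backing up the bytes themselves.
        entry = {
            "name": local_copy.name,
            "source": source_uri,
            "size_bytes": local_copy.stat().st_size,
            # A content hash lets you check that a future re-download matches.
            "sha256": hashlib.sha256(local_copy.read_bytes()).hexdigest(),
        }
        manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else []
        manifest.append(entry)
        MANIFEST.write_text(json.dumps(manifest, indent=2))
        # The local copy can now be deleted; only the manifest needs backing up.

    # Example: note a downloaded installer, then reclaim its disk space.
    # record_internet_object(pathlib.Path("some-installer.dmg"),
    #                        "https://example.com/some-installer.dmg")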

Memory is becoming larger and faster, but access and retrieval are still constrained by time. My answer to the initial question came when I asked myself whether the “download” model is still valid today. Computers were designed around the copy model: to work with data you copy it from disk to main memory, or the other way around. The moment you visit a website, a copy of it is stored in your browser’s assigned memory. Does this model still make sense in an always-on Internet world? I think the model should be upgraded.
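
To make the contrast concrete, here is a minimal Python sketch of both styles: fetching a full copy to disk versus processing the bytes as they arrive and keeping nothing. The URL and function names are placeholders, and streaming is only one possible shape such an upgrade could take.

    import tempfile
    import urllib.request

    URL = "https://example.com/big-dataset.csv"  # placeholder URL

    def download_model(url: str) -> str:
        # The copy model: fetch the whole object to disk, then work on the copy.
        with urllib.request.urlopen(url) as response, \
             tempfile.NamedTemporaryFile(suffix=".csv", delete=False) as tmp:
            tmp.write(response.read())
            return tmp.name  # the copy occupies disk space until you delete it

    def streaming_model(url: str) -> int:
        # An always-on alternative: process the bytes as they arrive, keep nothing.
        lines = 0
        with urllib.request.urlopen(url) as response:
            for _ in response:  # iterate line by line without writing to disk
                lines += 1
        return lines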