Friday, August 15, 2014

Can we make web apps less reliant on network?

So I was thinking a little about how I use Bootstrap on some of my sites, and since it's pretty commonly used code across many sites, it got me thinking.

Could we somehow share these resources across domains, reducing the need to download them? The downloading is normally what slows things down.

Here is my thought.

When parsing an .html file, the browser comes across a script tag with a src attribute, a la:

<script src=""></script>

When the browser downloads the file, it (maybe?) puts it in memory so it can be used (I could be wrong here).

What I was thinking is that we maybe generate a key value store, and it does something like this:

{"": "pointer to the memory where this resides"}

Now, instead of simply downloading a .js file, the browser checks the lookup (or hash table), finds that a file with that EXACT path already exists in memory, and just uses that instead of downloading the file again.
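Here's a minimal sketch of that lookup idea in JavaScript. To be clear, this is not how any browser actually implements it; the names (scriptCache, loadScript, the download callback) are all made up for illustration, and the "network" is a pluggable function so you can see exactly when a real fetch would happen.

```javascript
// A URL-keyed cache, standing in for the in-memory store described above.
const scriptCache = new Map(); // exact URL -> cached source text
let downloadCount = 0;         // counts actual "network" fetches, for illustration

// download is a stand-in for the network fetch; it is injected so the
// sketch stays self-contained and testable.
function loadScript(url, download) {
  if (scriptCache.has(url)) {
    // Cache hit: the EXACT same path was seen before, so reuse it.
    return scriptCache.get(url);
  }
  // Cache miss: go to the "network", then remember the result.
  downloadCount += 1;
  const source = download(url);
  scriptCache.set(url, source);
  return source;
}
```

The second page that asks for the same URL during a session would hit the Map and skip the download entirely.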


1. I don't know if it would be possible to modify that memory.
2. I wouldn't expect this caching to be indefinite, but it should be do-able across pages during an active session.

As long as the downloaded data can't be modified in memory, a string lookup in a local hash *should* give us a decent startup decrease, because we don't need to re-download the file(s). I also can't see how we would have any safety/security problems, since you're literally using the exact same file you downloaded earlier; you just don't need to re-download it.

I'm sure I'm missing something here, but I would think this should be possible. Got any ideas? Hit me up on Twitter (@onaclov2000) or leave a comment here.