#lrucache
Object disposal
// android.util.LruCache with android.graphics.Bitmap values
int cacheSize = 4 * 1024 * 1024; // 4 MiB memory budget

LruCache<String, Bitmap> bitmapCache = new LruCache<String, Bitmap>(cacheSize) {
    @Override
    protected int sizeOf(String key, Bitmap value) {
        // Measure entries in bytes so cacheSize is a memory budget, not an entry count.
        return value.getByteCount();
    }
};

synchronized (bitmapCache) {
    // get() and put() are each thread-safe, but the check-then-put
    // sequence as a whole needs external synchronization.
    if (bitmapCache.get(key) == null) {
        bitmapCache.put(key, value);
    }
}
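Since the post's title is about object disposal, it's worth noting that LruCache also exposes an entryRemoved callback, which is the hook for cleaning up evicted objects. A minimal sketch of extending the declaration above (the recycle() call is illustrative only, and is unsafe if the bitmap might still be displayed):

LruCache<String, Bitmap> bitmapCache = new LruCache<String, Bitmap>(cacheSize) {
    @Override
    protected int sizeOf(String key, Bitmap value) {
        return value.getByteCount();
    }

    @Override
    protected void entryRemoved(boolean evicted, String key, Bitmap oldValue, Bitmap newValue) {
        // Called whenever an entry is evicted or replaced; dispose of oldValue
        // here if nothing else still references it.
        if (evicted && !oldValue.isRecycled()) {
            oldValue.recycle(); // illustration only: skip if the bitmap may still be in use
        }
    }
};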
w2d5
What we did:
Finally the weekend. Again, the main focus today was on algorithms that pertain to performance and optimization: another long lecture on the ins and outs of hash maps and LRU (least recently used) caching. We went over how data structures are built and how choosing the right structure for a specific type of project can improve performance.
One of the more interesting aspects of hash maps is their relatively quick access to data compared to "regular" search techniques. Do you ever wonder why sometimes you pull up an app on your phone and it instantly loads stuff (like your feed, a map, or pictures), while other times it takes a bit of time to load content from the internet? It's mainly because of a mechanism called caching, which sets aside a small amount of memory on your local device for quick access. It's the same reason you can sometimes load up Google Maps even when you're offline: your phone remembers a small piece of it for quick and easy access without having to request the information from a server.
Enter LRU caching. Least recently used caching keeps the content the user has touched most recently and discards whatever has gone unused the longest. Say, for example, you open your Facebook feed on your phone and load the first 5 posts. As you scroll down, the posts at the top fall out of the cache and that memory is reused to hold the new posts loading from the bottom; when nothing gets re-accessed, it behaves like first in, first out. Now say you scroll back up to content you've already seen. The post you're currently on becomes the most recently used entry and is kept in the cache, while the posts further down age toward the back of the queue until they're either discarded or refreshed.
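To make that eviction order concrete, here is a small sketch of my own (not from the lecture) using Java's LinkedHashMap, which can behave as an LRU cache when constructed in access order:

import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    public static void main(String[] args) {
        final int capacity = 3;
        // accessOrder = true: iteration order follows recency of access
        Map<Integer, String> cache = new LinkedHashMap<Integer, String>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
                return size() > capacity; // evict the least recently used entry
            }
        };
        cache.put(1, "post1");
        cache.put(2, "post2");
        cache.put(3, "post3");
        cache.get(1);          // touching post1 makes it most recently used
        cache.put(4, "post4"); // evicts post2, the least recently used
        System.out.println(cache.keySet()); // prints [3, 1, 4]
    }
}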
Topics where I struggled:
Must take the practice assessment again!
Where I felt improved:
Understanding the higher level concepts. I was a bit tired this morning, but I was still excited to learn about LRU caching and hash maps. In addition, I feel like I'm starting to mesh very well with my fellow cohort friends. I can tell we're going to have a lot of fun together, and I definitely hope I get through this program.
Code that == Mind Blown!
https://en.wikipedia.org/wiki/Cache_algorithms
How I’m feeling:
Even though I'm tired and in need of massive amounts of sleep, I'm excited to get crackin' this weekend!
8/10 motivation
Check out my projects!
https://github.com/theRealYoshi
I’m taking suggestions!
https://yoshihiroluk.typeform.com/to/ilJBBy
32. LRU Cache
Design and implement a data structure for a Least Recently Used (LRU) cache. It should support the following operations: get and set.

get(key) - Get the value (will always be positive) of the key if the key exists in the cache; otherwise return -1.
set(key, value) - Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting the new item.
The LRU cache is a very important pattern in cache design. The most intuitive way to build one is with a doubly linked list, because adding and removing a node at the head (or anywhere, given a reference to it) takes constant time. To also provide quick access by key, you pair the list with a hash map storing <Key, LinkedListNode>.
You can read my code here.
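And in case that link doesn't resolve, here is a rough sketch of the design described above (my own version against the get/set interface, not the post's linked code):

import java.util.HashMap;

public class LRUCache {
    private static class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final HashMap<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, 0); // sentinel: most recently used side
    private final Node tail = new Node(0, 0); // sentinel: least recently used side

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;
        moveToFront(node); // a hit makes this key the most recently used
        return node.value;
    }

    public void set(int key, int value) {
        Node node = map.get(key);
        if (node != null) {
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() == capacity) {
            Node lru = tail.prev; // the least recently used node sits just before tail
            unlink(lru);
            map.remove(lru.key);
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        linkAtFront(fresh);
    }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkAtFront(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    private void moveToFront(Node node) {
        unlink(node);
        linkAtFront(node);
    }
}

The sentinel head and tail nodes avoid null checks when splicing, and because every get or set moves the touched node to the front, the node just before tail is always the eviction candidate.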