Caching Abstraction #211
Changes from 6 commits: ae66da7, 90ea034, 137bfd6, ba8b64b, bfcb9a1, 3a4eafb, 2eefde8, 4fb6b8c
**New file** (`@@ -0,0 +1,219 @@`):

```js
import { List, Map, OrderedSet, Record } from 'immutable'

export const CacheEntry = Record({
  value: null,
  storeStates: Map(),
  dispatchId: null,
})

/*******************************************************************************
 * interface PersistentCache {
 *   has(item)
 *   lookup(item, notFoundValue)
 *   hit(item)
 *   miss(item, entry)
 *   evict(item)
 *   asMap()
 * }
 *******************************************************************************/

/**
 * Plain map-based cache
 */
export class BasicCache {

  /**
   * @param {Immutable.Map} cache
   */
  constructor(cache = Map()) {
    this.cache = cache;
  }

  /**
   * Retrieves the associated value if it exists in this cache; otherwise
   * returns notFoundValue (or undefined if not provided)
   * @param {Object} item
   * @param {Object?} notFoundValue
   * @return {CacheEntry?}
   */
  lookup(item, notFoundValue) {
    return this.cache.get(item, notFoundValue)
  }

  /**
   * Checks if this cache contains an associated value
   * @param {Object} item
   * @return {boolean}
   */
  has(item) {
    return this.cache.has(item)
  }

  /**
   * Returns the cached items as a map
   * @return {Immutable.Map}
   */
  asMap() {
    return this.cache
  }

  /**
   * Updates this cache when it is determined to contain the associated value
   * @param {Object} item
   * @return {BasicCache}
   */
  hit(item) {
    return this;
  }

  /**
   * Updates this cache when it is determined to **not** contain the associated value
   * @param {Object} item
   * @param {CacheEntry} entry
   * @return {BasicCache}
   */
  miss(item, entry) {
    return new BasicCache(
      this.cache.update(item, existingEntry => {
        if (existingEntry && existingEntry.dispatchId > entry.dispatchId) {
          throw new Error("Refusing to cache older value")
        }
        return entry
      })
    )
  }

  /**
   * Removes an entry from the cache
   * @param {Object} item
   * @return {BasicCache}
   */
  evict(item) {
    return new BasicCache(this.cache.remove(item))
  }
}
```

**Review thread** (on the `PersistentCache` interface):

> **Reviewer:** Why are [...]
>
> **Author:** The reason there seem to be extra operations is that nuclear is designed to be used with a persistent data structure for the cache. For a caching abstraction to encapsulate something like LRU, the cache has to record when an item is read from the cache. Since [...]
>
> **Author:** I meant to add a comment in the code, but this interface is modeled directly after the caching protocol in clojure.core.cache, which is also for immutable data caches.
>
> **Reviewer:** Got it. I figured the reason was to keep the cache construct immutable. IMO it would be simpler for developers to allow mutability and go with [...]
>
> **Author:** My initial approach had this, but changing from an immutable to a mutable cache had negative implications (e.g. when the reactor evaluates using previous state, causing cache churn).
>
> **Reviewer:** I see, makes sense.

**Review thread** (on `BasicCache#miss`):

> **Reviewer:** I'd have expected this to be called [...]
>
> **Author:** Agreed it's awkward/jarring at first, but hopefully the reply above clears up where "hit" and "miss" come from.
>
> **Reviewer:** Yeah, the API is a bit strange, but it is a proven API in a functional context (clojure), which I really like.
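The protocol above never mutates a cache in place: `miss` and `evict` return a new cache value and leave the old one intact, while `hit` is a no-op for the basic strategy. A minimal sketch of that behavior, using a built-in `Map` copied on write instead of the PR's `Immutable.Map` (the `TinyCache` name and shapes are illustrative only):

```javascript
// Minimal sketch of the PersistentCache protocol over a plain built-in Map.
// Copy-on-write stands in for Immutable.Map's structural sharing.
class TinyCache {
  constructor(entries = new Map()) {
    this.entries = entries
  }
  has(item) { return this.entries.has(item) }
  lookup(item, notFoundValue) {
    return this.entries.has(item) ? this.entries.get(item) : notFoundValue
  }
  hit(item) { return this }  // a plain cache needs no read bookkeeping
  miss(item, entry) {
    const next = new Map(this.entries)  // copy so the old cache stays intact
    next.set(item, entry)
    return new TinyCache(next)
  }
  evict(item) {
    const next = new Map(this.entries)
    next.delete(item)
    return new TinyCache(next)
  }
}

const c0 = new TinyCache()
const c1 = c0.miss('a', 1)   // "miss" writes the freshly computed entry
console.log(c0.has('a'))     // false: the original cache is unchanged
console.log(c1.lookup('a'))  // 1
```

Because every write returns a fresh cache value, the reactor can associate a cache with each state snapshot, which is the property the thread above defends against a mutable design.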
```js
const DEFAULT_LRU_LIMIT = 1000
const DEFAULT_LRU_EVICT_COUNT = 1

/**
 * Implements a caching strategy that evicts least-recently-used items
 * when an item is being added to a cache that has reached a configured
 * size limit.
 */
export class LRUCache {

  constructor(limit = DEFAULT_LRU_LIMIT, evictCount = DEFAULT_LRU_EVICT_COUNT, cache = new BasicCache(), lru = OrderedSet()) {
    this.limit = limit;
    this.evictCount = evictCount
    this.cache = cache;
    this.lru = lru;
  }

  /**
   * Retrieves the associated value if it exists in this cache; otherwise
   * returns notFoundValue (or undefined if not provided)
   * @param {Object} item
   * @param {Object?} notFoundValue
   * @return {CacheEntry}
   */
  lookup(item, notFoundValue) {
    return this.cache.lookup(item, notFoundValue)
  }

  /**
   * Checks if this cache contains an associated value
   * @param {Object} item
   * @return {boolean}
   */
  has(item) {
    return this.cache.has(item)
  }

  /**
   * Returns the cached items as a map
   * @return {Immutable.Map}
   */
  asMap() {
    return this.cache.asMap()
  }

  /**
   * Updates this cache when it is determined to contain the associated value
   * @param {Object} item
   * @return {LRUCache}
   */
  hit(item) {
    if (!this.cache.has(item)) {
      return this;
    }

    // remove it first to reorder in lru OrderedSet
    return new LRUCache(this.limit, this.evictCount, this.cache, this.lru.remove(item).add(item))
  }

  /**
   * Updates this cache when it is determined to **not** contain the associated value.
   * If the cache has reached its size limit, the LRU item is evicted.
   * @param {Object} item
   * @param {CacheEntry} entry
   * @return {LRUCache}
   */
  miss(item, entry) {
    if (this.lru.size >= this.limit) {
      if (this.has(item)) {
        return new LRUCache(
          this.limit,
          this.evictCount,
          this.cache.miss(item, entry),
          this.lru.remove(item).add(item)
        )
      }

      const cache = (this.lru
        .take(this.evictCount)
        .reduce((c, evictItem) => c.evict(evictItem), this.cache)
        .miss(item, entry));

      return new LRUCache(
        this.limit,
        this.evictCount,
        cache,
        this.lru.skip(this.evictCount).add(item)
      )
    } else {
      return new LRUCache(
        this.limit,
        this.evictCount,
        this.cache.miss(item, entry),
        this.lru.add(item)
      )
    }
  }

  /**
   * Removes an entry from the cache
   * @param {Object} item
   * @return {LRUCache}
   */
  evict(item) {
    if (!this.cache.has(item)) {
      return this;
    }

    return new LRUCache(
      this.limit,
      this.evictCount,
      this.cache.evict(item),
      this.lru.remove(item)
    )
  }
}

/**
 * Returns the default cache strategy
 * @return {BasicCache}
 */
export function DefaultCache() {
  return new BasicCache()
}
```

**Review thread** (on the `has(item)` check in `LRUCache#miss`):

> **Reviewer:** Why bother with this case?
>
> **Author:** It's reasonable that you successfully look something up from the cache, but you have your own notion of whether it's stale and want to write a new value back. We do this currently with [...]
>
> **Reviewer:** I was more wondering why we worry about whether or not the cache has the value; we could just evict LRU values and add the item. But I guess that might lead to having [...]
**Changes to an existing file**, hunk `@@ -1,5 +1,6 @@`:

```diff
 import Immutable from 'immutable'
 import logging from '../logging'
+import { CacheEntry } from './cache'
 import { isImmutableValue } from '../immutable-helpers'
 import { toImmutable } from '../immutable-helpers'
 import { fromKeyPath, getStoreDeps, getComputeFn, getDeps, isGetter } from '../getter'
```
Hunk `@@ -330,22 +331,21 @@ export function evaluate(reactorState, keyPathOrGetter) {`:

```diff
   }

   // Must be a Getter
-  // if the value is cached for this dispatch cycle, return the cached value
-  if (isCached(reactorState, keyPathOrGetter)) {
-    // Cache hit
-    return evaluateResult(
-      getCachedValue(reactorState, keyPathOrGetter),
-      reactorState
-    )
-  }
-
-  // evaluate dependencies
-  const args = getDeps(keyPathOrGetter).map(dep => evaluate(reactorState, dep).result)
-  const evaluatedValue = getComputeFn(keyPathOrGetter).apply(null, args)
+  const cache = reactorState.get('cache')
+  var cacheEntry = cache.lookup(keyPathOrGetter)
+  const isCacheMiss = !cacheEntry
+  if (isCacheMiss || isDirtyCacheEntry(reactorState, cacheEntry)) {
+    cacheEntry = createCacheEntry(reactorState, keyPathOrGetter)
+  }

   return evaluateResult(
-    evaluatedValue,
-    cacheValue(reactorState, keyPathOrGetter, evaluatedValue)
+    cacheEntry.get('value'),
+    reactorState.update('cache', cache => {
+      return isCacheMiss ?
+        cache.miss(keyPathOrGetter, cacheEntry) :
+        cache.hit(keyPathOrGetter)
+    })
   )
 }
```

**Review thread** (on `cache.lookup(keyPathOrGetter)`):

> **Reviewer:** Do you think the cache lookup here should also handle the resolution of [...]
>
> **Reviewer:** If the cache is a hit on a [...] Maybe I am missing something here, but it seems like this is wrong in a lot of cases:
>
> ```js
> // store state 1
> var result1 = reactor.evaluate(getter)
> reactor.dispatch('action')
> // store state 2
> var results2 = reactor.evaluate(getter)
> // it would seem that since the reactor cache has an entry for `getter` it will always be a hit
> ```
>
> Is there a test case to verify this?
>
> **Author:** Ah, good catch. This should be:
>
> ```js
> const isCacheMiss = !cacheEntry || isDirtyCacheEntry(reactorState, cacheEntry)
> if (isCacheMiss) {
>   // ...
> }
> // ...
> ```
>
> I'm surprised this isn't covered by a test case already. Will add one.
>
> **Author:** The caches are generalized right now and don't have nuclear-specific logic. I felt like this was still the right place for that logic.
>
> **Author:** @jordangarcia Fixed w/ 4fb6b8c
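The corrected flow the thread converges on (treat an entry as a miss when it is absent *or* dirty, then record a hit or miss against the persistent cache) can be distilled into a stand-alone sketch. The `evaluateWithCache` helper and the plain `Map`/object shapes are simplified stand-ins, not the PR's Immutable-based code:

```javascript
// Distilled caching path: miss when absent or dirty, recompute on miss,
// and return both the value and the next (persistently updated) cache.
function evaluateWithCache(cache, key, compute, isDirty) {
  let entry = cache.get(key)
  const isCacheMiss = !entry || isDirty(entry)
  if (isCacheMiss) {
    entry = { value: compute() }  // recompute only on a miss
  }
  // persistent update: only a miss produces a new cache value
  const nextCache = isCacheMiss ? new Map(cache).set(key, entry) : cache
  return { value: entry.value, cache: nextCache }
}

let calls = 0
const compute = () => { calls++; return 42 }
let r = evaluateWithCache(new Map(), 'g', compute, () => false)  // miss: computes
r = evaluateWithCache(r.cache, 'g', compute, () => false)        // hit: cached
console.log(r.value, calls)  // 42 1
r = evaluateWithCache(r.cache, 'g', compute, () => true)         // dirty: recomputes
console.log(calls)  // 2
```

The third call reproduces the reviewer's scenario: the entry exists, but a dirty check keyed off store state forces a recompute instead of a stale hit.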
Hunk `@@ -375,57 +375,31 @@ export function resetDirtyStores(reactorState) {`:

```diff
   return reactorState.set('dirtyStores', Immutable.Set())
 }

 /**
- * Currently cache keys are always getters by reference
- * @param {Getter} getter
- * @return {Getter}
- */
-function getCacheKey(getter) {
-  return getter
-}
-
-/**
  * @param {ReactorState} reactorState
- * @param {Getter|KeyPath} keyPathOrGetter
- * @return {Immutable.Map}
+ * @param {CacheEntry} cacheEntry
+ * @return {boolean}
  */
-function getCacheEntry(reactorState, keyPathOrGetter) {
-  const key = getCacheKey(keyPathOrGetter)
-  return reactorState.getIn(['cache', key])
-}
-
-/**
- * @param {ReactorState} reactorState
- * @param {Getter} getter
- * @return {Boolean}
- */
-function isCached(reactorState, keyPathOrGetter) {
-  const entry = getCacheEntry(reactorState, keyPathOrGetter)
-  if (!entry) {
-    return false
-  }
+function isDirtyCacheEntry(reactorState, cacheEntry) {
+  const storeStates = cacheEntry.get('storeStates')

-  const storeStates = entry.get('storeStates')
-  if (storeStates.size === 0) {
-    // if there are no store states for this entry then it was never cached before
-    return false
-  }
-
-  return storeStates.every((stateId, storeId) => {
-    return reactorState.getIn(['storeStates', storeId]) === stateId
+  // if there are no store states for this entry then it was never cached before
+  return !storeStates.size || storeStates.some((stateId, storeId) => {
+    return reactorState.getIn(['storeStates', storeId]) !== stateId
   })
 }

 /**
- * Caches the value of a getter given state, getter, args, value
+ * Evaluates the getter for the given reactorState and returns a CacheEntry
  * @param {ReactorState} reactorState
  * @param {Getter} getter
- * @param {*} value
- * @return {ReactorState}
+ * @return {CacheEntry}
  */
-function cacheValue(reactorState, getter, value) {
-  const cacheKey = getCacheKey(getter)
-  const dispatchId = reactorState.get('dispatchId')
+function createCacheEntry(reactorState, getter) {
+  // evaluate dependencies
+  const args = getDeps(getter).map(dep => evaluate(reactorState, dep).result)
+  const value = getComputeFn(getter).apply(null, args)

   const storeDeps = getStoreDeps(getter)
   const storeStates = toImmutable({}).withMutations(map => {
     storeDeps.forEach(storeId => {
```

Hunk `@@ -434,19 +408,11 @@ function cacheValue(reactorState, getter, value) {`:

```diff
     })
   })

-  return reactorState.setIn(['cache', cacheKey], Immutable.Map({
+  return CacheEntry({
     value: value,
     storeStates: storeStates,
-    dispatchId: dispatchId,
-  }))
-}
-
-/**
- * Pulls out the cached value for a getter
- */
-function getCachedValue(reactorState, getter) {
-  const key = getCacheKey(getter)
-  return reactorState.getIn(['cache', key, 'value'])
+    dispatchId: reactorState.get('dispatchId'),
+  })
 }

 /**
```
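The staleness rule in `isDirtyCacheEntry` above (an entry is dirty when any store it depends on has moved past the state id recorded at cache time, or when no store states were recorded at all) can be sketched with plain objects. The `isDirtyEntry` name, `counterStore` id, and object shapes are illustrative, not the PR's Immutable structures:

```javascript
// Plain-object sketch of the dirty check: compare each recorded store state
// id against the current reactor state; any difference means the cached
// value may be stale.
function isDirtyEntry(reactorState, cacheEntry) {
  const recorded = cacheEntry.storeStates
  const storeIds = Object.keys(recorded)
  if (storeIds.length === 0) {
    return true  // never cached against any store state
  }
  return storeIds.some(id => reactorState.storeStates[id] !== recorded[id])
}

const entry = { value: 42, storeStates: { counterStore: 1 } }
console.log(isDirtyEntry({ storeStates: { counterStore: 1 } }, entry))  // false: fresh
console.log(isDirtyEntry({ storeStates: { counterStore: 2 } }, entry))  // true: store advanced
```

This mirrors the diff's `!storeStates.size || storeStates.some(...)` expression, which inverts the removed `isCached` logic so the new function answers "is it dirty?" rather than "is it cached?".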
**Review thread** (general):

> **Reviewer:** How do you envision using the LRU cache? Bundled with nuclear? If so, how is it exposed to the end user?
>
> **Author:** As an opt-in configuration change that is passed when the Reactor is created. Probably punt that to @jordangarcia for the best way to expose it.
>
> **Reviewer:** I'm thinking probably exposing both the `DefaultCache` and `LRUCache` on the `Nuclear` object, and allowing one to be passed into creating a new reactor.