

The reason for having a cache mechanism is almost obvious: we want to improve overall performance, and avoiding costly queries is a simple way of doing so. Caching should essentially be done at every tier based on usage patterns: on the client (browser), on the web front end (usually what you, the reader, are developing), on the web services (what this document is about) and on the database level. In short, each tier has different options and responsibilities when it comes to caching.

The InfoSystems webservices currently contain two cache varieties:

  1. The Business Configuration Cache

  2. The Result/Query Cache

Each is described in detail below, but in short: the Business Configuration Cache caches semi-static business data returned from the database, and the Query Cache caches results based on the operations and parameters of the web service methods.

Business Configuration Caching

The InfoSystems application suite relies on a rather large amount of business configuration data, i.e. configuration settings that determine how the system works and under which parameters it should operate. This is data such as the available Titles, the products on a given title, what each product costs in a given country/municipality, or which parameters are used for contacting a payment service provider.

Each piece of data is not that expensive to query from the database, but because it is queried all the time, and because in total there is a lot of it, most of it is cached in memory during application initialisation. This in-memory collection is what we in this document denote the Business Configuration Cache.
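The idea can be sketched as follows. This is a minimal illustration, not the actual InfoSystems code: all class and function names are hypothetical, and a stub stands in for the real database layer.

```python
# Sketch of a business configuration cache: semi-static data is queried
# once during application initialisation and served from memory afterwards.
class BusinessConfigurationCache:
    def __init__(self, query_database):
        # query_database stands in for the real database access layer
        self._data = {
            "titles": query_database("titles"),
            "products": query_database("products"),
        }

    def get(self, section):
        # Pure in-memory lookup; no database round trip after initialisation
        return self._data[section]


# Stub database layer, for demonstration only
def fake_query(table):
    return {"titles": ["DailyNews"], "products": ["P100", "P200"]}[table]

cache = BusinessConfigurationCache(fake_query)
print(cache.get("products"))  # ['P100', 'P200'] -- served from memory
```

Note that the cache is filled once, at construction time; this mirrors why a newly added product is not visible until the service is recycled.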

Consequences of the configuration cache?

All of the InfoSystems modules use this cache constantly and feed it with data occasionally, and at the time of writing the cache is local to a given instance. So if a client adds a new product, it won't automatically show up in the list of products on the webservice server. This is rather important because it affects how the application works. For instance, if a new product is added and an end user immediately orders a subscription with that new product code, the order is likely to fail because the webservice is not yet aware of the newly added product.

Most often new business configurations are added days or weeks in advance of them being used for real-life business cases, in which case the web server will have recycled and rebuilt its cache with the new data by then. However, in some cases this is an issue, and the only way to solve it is to force a recycle of the service.

The default recycle interval depends on the configuration of the webserver, but it typically happens at least once a day, and by default several times a day.

Result/Query Caching on select service methods

The cache mechanism utilised for the selected web service methods is a simple query result cache. That is, given a query operation and a known input it returns the last result; given a query operation and an unknown input it runs the query, stores the result in the cache and returns it. We will get back to a concrete example later in the post.

In terms of the web services, the method name is the query operation and the values of the method parameters are the input.

For instance, the method CalculatePrice is the query operation, and the Title, Currency, Zip and Product parameters (the rest are excluded for brevity) are all part of the operation input.

Unlike the Business Configuration Cache, the query cache sits "on top" of selected web service methods; it is not an integral part of the application suite and as such does not affect the internal logic of the various web service methods.
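A cache that sits on top of a method without touching its internals can be sketched as a wrapper. The following is an illustrative Python sketch, not the real service code: the cache key is the operation name plus its parameter values, mirroring how the web service keys on method name and arguments.

```python
import functools

def query_cache(func):
    """Wrap a service method with a simple query result cache."""
    results = {}

    @functools.wraps(func)
    def wrapper(*args):
        key = (func.__name__,) + args   # operation + input = cache key
        if key not in results:          # unknown input: run the query
            results[key] = func(*args)
        return results[key]             # known input: return last result
    return wrapper

calls = []

@query_cache
def calculate_price(title, currency, zip_code, product):
    calls.append(1)                     # track real executions
    return 42.50                        # stand-in for the real calculation

calculate_price("DailyNews", "DKK", "8000", "P100")
calculate_price("DailyNews", "DKK", "8000", "P100")
print(len(calls))  # 1 -- the second call was served from the cache
```

Because the wrapper never inspects or alters the wrapped method, the method's internal logic stays unaffected, which is exactly the property described above.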

How it works

Normally a method request involves a calculation and/or a query against the database: the actual method execution, followed by serialization (and transport involving marshalling and such, but let's ignore that for now). So for any and all requests the path would look something like the following:

Receive request + Execute request (costly) + Serialization

By introducing a cache we change the picture, like so:

Receive request + Cache lookup (cheap) + Execute request + Store in cache + Serialization

However once info has been added to cache the path becomes

Receive request + Cache lookup + Serialization

So the costly step of executing the request has been removed. However, nothing is free: the cost is increased memory use and possibly added complexity. About the memory we cannot do anything, but the complexity should be negligible due to the choice of methods we have applied caching to.
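The three request paths above can be sketched in a few lines. This is an assumed structure for illustration, not the real request pipeline; the key format and function names are made up.

```python
cache = {}

def serialize(value):
    # Stand-in for the real serialization step
    return str(value)

def handle_request(key, execute):
    if key in cache:                  # path 3: receive + lookup (hit)
        return serialize(cache[key])  # + serialization, no execution
    result = execute()                # path 2: lookup missed, costly execution
    cache[key] = result               # store for subsequent requests
    return serialize(result)

executions = []
def execute():
    executions.append(1)              # simulate the costly database query
    return {"price": 85}

handle_request("CalculatePrice:8000:P100", execute)
handle_request("CalculatePrice:8000:P100", execute)
print(len(executions))  # 1 -- the second request skipped the costly step
```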

Which methods and why

To reduce issues related to stale caches, and to minimise issues in general, we have opted to apply the caching mechanism only to selected methods. In essence these methods are the ones with rarely changing data, like product codes, prices et cetera: semi-static data that in any event requires the web service host to be recycled when updates occur.

At the time of writing we have no clear indicator/documentation of the list of cached methods; we are working on changing that.

Performance and Example

As promised here is a concrete example with performance numbers included to show the difference.

Calculate Price with a fixed zip code and product

With cache:
  5000 requests, average request time 9.46 ms (roughly 1 min 24 s in total)
Without cache:
  5000 requests, average request time 85.11 ms (roughly 2 min 39 s in total)

Even though the example is rather contrived, it represents the difference in execution time quite well. Given two similar requests (it's not unlikely to have two requests for the same product in the same zip code within a day's work), the second request will be served in roughly 10 ms, compared to the normal 90 ms.

Cache expiration and eviction

So far we have not talked about cache expiration and eviction, mainly because there isn't any. The only way to expire the cache is to recycle the web service, in which case the entire cache is expired; we have provided no direct means of evicting specific cache items. This might be included in the future, but for now you will have to live without it.


Configuring the cache

Configuring the web service cache is relatively easy. Caching is enabled by default, but you can disable it by changing the web.config file.

Cache Settings of Web.config file
    <!-- If set to True the cache is disabled -->
    <setting name="DisableCache" serializeAs="String">
        <value>False</value>
    </setting>
    <!-- If set to True, logging of cache hits/misses is disabled; this has no effect if caching is already disabled -->
    <setting name="DisableCacheLogging" serializeAs="String">
        <value>False</value>
    </setting>
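How such a flag could gate the cache at runtime can be sketched as follows. Only the DisableCache name comes from the configuration above; everything else in this snippet is a hypothetical illustration.

```python
def make_handler(disable_cache):
    """Build a request handler whose caching is gated by a config flag."""
    cache = {}

    def handle(key, execute):
        if not disable_cache and key in cache:
            return cache[key]       # cache hit, only when caching is enabled
        result = execute()          # execute the (costly) request
        if not disable_cache:
            cache[key] = result     # store only when caching is enabled
        return result
    return handle

hits = []
run = lambda: hits.append(1) or "v"     # counts each real execution

cached = make_handler(disable_cache=False)
uncached = make_handler(disable_cache=True)

cached("k", run)
cached("k", run)      # served from cache, no execution
uncached("k", run)
uncached("k", run)    # DisableCache=True: executes every time
print(len(hits))      # 3 -- one cached execution, two uncached
```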