Recently I have been working on a project that involved communication with a lot of external systems, mainly web services and databases. During the development phase of the project I only had access to test versions of these systems, i.e. I was dealing with test data. One of these external systems was of particular interest, as it served the main data the customer dealt with. It was a traditional SOAP web service (built on ASP.NET) containing a bunch of methods to retrieve and update data. Everything seemed pretty normal so far. However, a few of the methods that fetch data turned out to be very slow. By slow I mean that the external server took its time to do the necessary computations: calls to these methods were sometimes so slow that users were stuck on a waiting screen for 10 minutes. This is pretty annoying, and the customer noticed it. The first idea was, of course, to ask whether the service could be optimized. Unfortunately, the service itself is developed and maintained by another company, and it could take some time until a better version was released. We didn't have that time, so we had to work on the issue ourselves.
Possible solutions
In the following I explain, in the order we tried them, the solutions that eventually fixed this issue.
Solution 1: Use Cache
Caching is the traditional technique applied in such situations. As we developed the project on ASP.NET (MVC), we relied on the ASP.NET cache. The cache is a simple key-value store, preserved for the whole application's lifecycle. One can add items to it, retrieve them later by key, and either remove them manually or let ASP.NET expire them. The following code demonstrates the usage:
public IEnumerable GetItemsById(int id)
{
    string cacheKey = "GetItemsById_" + id;
    IEnumerable items = _cache[cacheKey] as IEnumerable;
    if (items == null)
    {
        // Cache miss: call the slow service and cache the result for 5 hours.
        items = _client.GetItemsById(id);
        _cache.Insert(cacheKey, items, null,
            DateTime.Now.AddHours(5), Cache.NoSlidingExpiration);
    }
    return items;
}
In the above code snippet, _cache is an instance of System.Web.Caching.Cache and _client is an instance of our service client. This first solution has the following advantages and disadvantages:
Pros
- Only one request within a certain amount of time goes directly to the service; every subsequent request is served from the cache
Cons
- For a certain amount of time we show possibly old data
- Once in a while a user will again experience a long waiting time, namely when the cache expires
The first disadvantage is more or less fine. Using a cache always means showing possibly stale data for a certain amount of time; this is a trade-off that customers typically accept. One can of course play with the expiration time: if the data is not modified very often, the cache can live longer, and vice versa.
Unfortunately, the customer was still not satisfied that their users would have to experience such a long waiting time once in a while. So we had to think of a better solution to the problem.
Solution 2: Keep Items In Cache Forever
Wait, what if there is a new version of the data in the web service? Sure, I don't mean "forever" literally. The ASP.NET cache mechanism actually provides very nice extension points. One of them is the option to provide a callback when inserting an item into the cache. This callback is a void function that takes 3 parameters (see the delegate signature after the list):
- the key of the cached item
- the value of the cached item
- the reason why the item has been removed from the cache
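In System.Web.Caching this callback type is the CacheItemRemovedCallback delegate, which the framework declares as follows:

public delegate void CacheItemRemovedCallback(
    string key,                     // the key of the cached item
    object value,                   // the value of the cached item
    CacheItemRemovedReason reason   // why the item was removed
);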
This callback is invoked on a background thread when the item is removed from the cache. With this option we can implement the following: when our items expire, we immediately put them back in the cache, this time without an expiration date. Then we make a request to the web service to get a newer version of the data, and once we get a response, we update the cache with it. It looks something like this:
public IEnumerable GetItemsById(int id)
{
    string cacheKey = "GetItemsById_" + id;
    IEnumerable items = _cache[cacheKey] as IEnumerable;
    if (items == null)
    {
        items = _client.GetItemsById(id);
        _cache.Insert(cacheKey, items, null,
            DateTime.Now.AddHours(5), Cache.NoSlidingExpiration,
            CacheItemPriority.Normal, OnItemsRemoved);
    }
    return items;
}

// Note: _cache and _client must be reachable from this static method
// (e.g. as static fields).
public static void OnItemsRemoved(string key, object value, CacheItemRemovedReason reason)
{
    // Put the stale items back immediately, without an expiration date,
    // so requests keep hitting the cache while we refresh.
    _cache.Insert(key, value);

    int id = Int32.Parse(key.Replace("GetItemsById_", ""));
    IEnumerable items = _client.GetItemsById(id);
    _cache.Insert(key, items, null,
        DateTime.Now.AddHours(5), Cache.NoSlidingExpiration,
        CacheItemPriority.Normal, OnItemsRemoved);
}
Pros
- We “always” keep our items in the cache, so users will always hit it
- The refresh requests to the service are done asynchronously, so users will not see any delays
Cons
- We keep possibly dirty data for a longer time
- The first request will still have to wait
Well, the first disadvantage is that we keep stale data for around 10 more minutes, i.e. for as long as the refresh call takes. That is actually fair enough: we have already kept this version of the data in the cache for 5 hours, so 10 more minutes will not hurt anyone. However, the second disadvantage is still problematic, especially when we do small deploys that recycle the application pool and consequently clear the cache. To address it, we have to involve a more durable cache. What about a database?
Solution 3: Use Database For Caching
Yes, we can use a database to back the cache. ASP.NET already provides the nice functionality to check for expiration and trigger a callback, so it is not a good idea to move the cache entirely to the database. Instead, we keep our items cached in two places: in memory and in a database. Whether it is a relational database or a simple key-value store does not really matter in our case; of the ACID properties we (only) need durability. The updated solution looks like this:
public IEnumerable GetItemsById(int id)
{
    string cacheKey = "GetItemsById_" + id;
    IEnumerable items = _cache[cacheKey] as IEnumerable;
    if (items == null)
    {
        // Fall back to the database copy and cache it only briefly,
        // so the removal callback fires soon and fetches fresh data.
        items = _db.Get(cacheKey) as IEnumerable;
        if (items != null)
        {
            _cache.Insert(cacheKey, items, null,
                DateTime.Now.AddMinutes(1), Cache.NoSlidingExpiration,
                CacheItemPriority.Normal, OnItemsRemoved);
        }
    }
    if (items == null)
    {
        // Nothing cached anywhere: this request pays the full price.
        items = _client.GetItemsById(id);
        _cache.Insert(cacheKey, items, null,
            DateTime.Now.AddHours(5), Cache.NoSlidingExpiration,
            CacheItemPriority.Normal, OnItemsRemoved);
        _db.Add(cacheKey, items);
    }
    return items;
}

public static void OnItemsRemoved(string key, object value, CacheItemRemovedReason reason)
{
    // Serve the stale items while refreshing in the background.
    _cache.Insert(key, value);

    int id = Int32.Parse(key.Replace("GetItemsById_", ""));
    IEnumerable items = _client.GetItemsById(id);
    _cache.Insert(key, items, null,
        DateTime.Now.AddHours(5), Cache.NoSlidingExpiration,
        CacheItemPriority.Normal, OnItemsRemoved);
    _db.Add(key, items);
}
In the above snippet, _db is an abstraction of a database that provides methods for retrieving and updating data. The major change is in the GetItemsById method: if we don't find our items in the cache, we check the database. Although slower than main memory, requests to the database are still very fast (especially if it runs locally). And if we find data in the database, we use a small trick: we add it to the cache, so that the next request will hit, but only for a short amount of time, e.g., 1 minute. After that our callback is triggered and we fetch a newer version of the data.
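Just to make the example self-contained, here is a minimal sketch of what the _db abstraction could look like. The interface name and the storage details are my assumptions, not a prescribed implementation:

// Hypothetical abstraction over the cache table (names are illustrative).
public interface ICacheStore
{
    // Returns the stored value, or null if the key is unknown.
    object Get(string key);

    // Inserts the value, or overwrites an existing entry with the same key.
    void Add(string key, object value);
}

// A simple implementation could serialize the items into a two-column
// table (Key, SerializedValue) in any relational database.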
Pros
- Users can still see some data even though the application pool has been restarted recently
Cons
- The very first request will still have to wait
- Data from the database can be very old
Well, we cannot really do much about the first disadvantage: at some point we simply have to fetch the data from the web service. We can, however, write a small script that calls our website once after a deploy, so that the data ends up in the database; or we can do this manually, if feasible. The second disadvantage can be more or less problematic depending on the customer's data. If the website is constantly running and just being restarted from time to time, then there is probably nothing to worry about: the data we keep in the database cannot get that old.
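Such a warm-up can be a trivial script that requests the slow pages once after a deploy. A minimal sketch, assuming a hypothetical URL scheme and a known list of IDs:

// Hypothetical warm-up script: hit the slow endpoints once so the data
// lands in the cache and in the database before real users arrive.
using (var client = new System.Net.WebClient())
{
    int[] knownIds = { 1, 2, 3 }; // the IDs worth pre-fetching (assumption)
    foreach (int id in knownIds)
    {
        client.DownloadString("http://our-website/Items/GetItemsById/" + id);
    }
}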
In the last solution, however, there is another hidden disadvantage. I mentioned earlier that our callback is triggered on a background thread. This is perfect for us, but it can also harm us if there are many requests to the GetItemsById method (e.g., with different IDs): the ASP.NET cache can become very slow when many long-running callbacks pile up on its background thread. That is why it is a good idea to run the body of the callback on a separate background thread, i.e. one thread per callback (or a thread taken from the thread pool), as sketched below.
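A minimal sketch of this idea, based on the callback from above: the stale value is re-inserted synchronously, and only the slow refresh is queued on the thread pool.

public static void OnItemsRemoved(string key, object value, CacheItemRemovedReason reason)
{
    // Re-insert the stale items immediately, on the current thread,
    // so the key never disappears from the cache.
    _cache.Insert(key, value);

    // Run the slow service call on a thread-pool thread instead of
    // blocking the cache's own background thread.
    ThreadPool.QueueUserWorkItem(state =>
    {
        int id = Int32.Parse(key.Replace("GetItemsById_", ""));
        IEnumerable items = _client.GetItemsById(id);
        _cache.Insert(key, items, null,
            DateTime.Now.AddHours(5), Cache.NoSlidingExpiration,
            CacheItemPriority.Normal, OnItemsRemoved);
        _db.Add(key, items);
    });
}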
Conclusion
In this post I tried to explain how to address issues with the integration of external systems that respond slowly. Integrating external systems very often leads to problems, but with a few tricks we can improve the user experience and hide these problems. The solutions above can give you some ideas of what you can do to save your end-users, but they are far from perfect. You still need to handle exceptions that may occur and make sure that the data in the cache is updated no matter what.
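For instance, if the refresh call in the callback throws, the stale data should survive and the refresh should be retried later. A minimal sketch of such a guard around the refresh part of OnItemsRemoved, with an assumed 5-minute retry interval:

// Inside OnItemsRemoved, after re-inserting the stale value:
try
{
    IEnumerable items = _client.GetItemsById(id);
    _cache.Insert(key, items, null,
        DateTime.Now.AddHours(5), Cache.NoSlidingExpiration,
        CacheItemPriority.Normal, OnItemsRemoved);
    _db.Add(key, items);
}
catch (Exception)
{
    // The service call failed: keep serving the stale value, but re-insert
    // it with a short expiration so the callback fires again and retries.
    _cache.Insert(key, value, null,
        DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration,
        CacheItemPriority.Normal, OnItemsRemoved);
}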