Caching Guidance
Caching is a common technique that aims to improve the performance and scalability of a system by
temporarily copying frequently accessed data to fast storage located close to the application. Caching is
most effective when an application instance repeatedly reads the same data, especially if the original
data store is slow relative to the speed of the cache, is subject to a high level of contention, or is far
away, resulting in network latency.
Caching in Cloud Applications
There are two main types of cache commonly used by cloud applications:

- An in-memory cache, where data is held locally on the computer running an instance of an
application.

- A shared cache, which can be accessed by several instances of an application running on
different computers.
In-memory Caching
The most basic type of cache is an in-memory store, held in the address space of a single process and
accessed directly by the code that runs in that process. This type of cache is very quick to access, and it
can be an extremely effective strategy for storing modest amounts of static data (the size of a cache is
typically constrained by the amount of memory available on the machine hosting the process).
If you have multiple instances of an application running concurrently that use this model, each will have
its own cache. You should think of a cache as a snapshot of the original data at some point in the past; if
this data is not static it is likely that different application instances will hold different versions of the
data in their caches, so the same query performed by these instances could return different results, as
shown in Figure 1.
Figure 1
Using an in-memory cache in different instances of an application
You can implement an in-memory cache in an application by using the MemoryCache class of the .NET
Framework. For more information see the MemoryCache Class page on MSDN.
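To make the staleness issue concrete, the following sketch simulates two application instances that each maintain a private in-memory cache over the same data store. It is written in Python for brevity, and all class and variable names are illustrative; a .NET application would use the MemoryCache class instead.

```python
class InMemoryCache:
    """Minimal per-process cache: a dictionary plus a loader callback.
    Illustrative sketch only; not any Azure or .NET API."""

    def __init__(self, loader):
        self._loader = loader      # fetches from the original data store
        self._store = {}           # lives in this process's address space

    def get(self, key):
        if key not in self._store:                # cache miss: load and keep a copy
            self._store[key] = self._loader(key)
        return self._store[key]                   # cache hit: no trip to the store

# Two application instances, each with its own private cache.
data_store = {"price:42": 10.0}
instance_a = InMemoryCache(lambda k: data_store[k])
instance_b = InMemoryCache(lambda k: data_store[k])

instance_a.get("price:42")          # both instances take a snapshot of 10.0
instance_b.get("price:42")
data_store["price:42"] = 12.0       # the original data changes...
stale = instance_a.get("price:42")  # ...but each instance still sees 10.0
```

Each cache is a snapshot taken at some point in the past, so the same query against the two instances can keep returning the old value after the data store has moved on, exactly as Figure 1 illustrates.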
Shared Caching
Using a shared cache can help to alleviate the concern, inherent in in-memory caching, that the data
held in each cache may differ. Shared caching ensures that different application instances see the same view of cached
data by locating the cache in a separate location, typically hosted as part of a separate service, as shown
in Figure 2. The disadvantages of this approach are that the cache is slower to access because it is no
longer held in the memory of each application instance, and the requirement to implement a separate
cache service may add complexity to the solution.
Figure 2
Using a shared cache
An important benefit of using the shared caching approach is the scalability that it can provide. Many
shared cache services are implemented by using a cluster of servers and software that distributes the
data across the cluster in a transparent manner. An application instance simply sends a request to the
cache service, and the underlying infrastructure is responsible for determining the location of the
cached data in the cluster. You can easily scale the cache by adding more servers.
Shared Caching in Windows Azure
Windows Azure provides two solutions for shared caching:
- Windows Azure Cache service, which enables you to create a distributed cache that can be
shared by applications, whether these applications are implemented as Windows Azure Cloud
Services, Windows Azure Websites, or inside Windows Azure Virtual Machines.

- In-role caching, which enables you to specify that a web or worker role in a Windows Azure
Cloud Service solution provides the memory for a cache. The cache is distributed across all
instances of the role, and it is managed by the Windows Azure infrastructure. The cache is
available only to the roles in the Cloud Service, not to other applications.
For information about the caching models available in Windows Azure, visit the Windows Azure Cache
page on the MSDN website.
ASP.NET State and HTML Output Caching
ASP.NET web applications that use Windows Azure web roles can save session state information and
HTML output in Windows Azure Cache using two providers designed for this purpose:
- The Session State Provider for Windows Azure enables you to share session information
between different instances of an ASP.NET web application. It is very useful in web farm
situations where client-server affinity is not available and caching session data in memory would
not be appropriate.

- The Output Cache Provider for Windows Azure enables you to save the HTTP responses
generated by an ASP.NET web application by using a Windows Azure cache. Using the Output
Cache Provider with Windows Azure Cache can improve the response times of applications that
render complex HTML output; application instances generating similar responses can make use
of the shared output fragments in the cache rather than regenerating this HTML output each
time.
For more information on using the Session State Provider and the Output Cache Provider, visit the
ASP.NET 4 Cache Providers for Windows Azure Cache page on the MSDN website.
Considerations for Using Caching
Caching is ideally suited to data that has a high proportion of reads compared to writes. Consider the
following when deciding whether to use caching in an application:
- Types of data to cache

- Using read-through and write-through caching

- Managing data expiration in a cache

- Managing concurrency in a cache

- Implementing high availability and security
The following sections describe these considerations in more detail.
Types of Data to Cache
The key to using a cache effectively lies in determining the most appropriate data to cache. Caching
typically works well with data that is immutable or that changes infrequently. Examples include
reference information such as product and pricing information in an ecommerce application, or shared
static resources that are costly to construct. Caching can also be used to avoid repeated computations. If
an operation transforms data or performs a complicated calculation, save the results of the operation in
the cache. If the same calculation is required subsequently, the application can simply retrieve the
results from the cache.
Caching may be less useful for dynamic data. In this situation either the cached information can become
stale very quickly, or the overhead of keeping the cache synchronized with the original data store
reduces the effectiveness of caching.
An application can modify data held in a cache, but consider the cache as a transient data store. Do not
store valuable data only in the cache, but make sure that you maintain the information in the original
data store as well. In this way, if the cache should become unavailable, you minimize the chance of
losing data.
Using Read-through and Write-through Caching
Some commercial caching solutions implement read-through and write-through caching whereby an
application always reads and writes data by using the cache. When an application fetches data, the
underlying caching service determines whether the data is currently held in the cache, and if not the
caching service retrieves the data from the original data store and adds it to the cache before returning
the data to the application. Subsequent read requests should find the data in the cache.
Read-through caching effectively caches data on demand. Data that an application does not use will not
be cached. When an application modifies data, it writes the changes to the cache. The caching service
transparently makes the same change to the original data store.
For systems such as Windows Azure Cache that do not provide read-through and write-through caching,
it is the responsibility of the applications that use the cache to maintain the data in the cache. The most
straightforward approach is to implement the Cache-Aside pattern, which you can use to build an
abstraction layer in your application code that emulates a read-through and write-through cache.
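The following sketch shows one way such an abstraction layer can emulate read-through and write-through behavior over a shared cache. The class name and the dict-like cache and data store are illustrative assumptions, not part of the Windows Azure Caching API.

```python
class CacheAsideStore:
    """Sketch of the Cache-Aside pattern. `cache` and `data_store` are assumed
    to be dict-like; real code would target a caching client library instead."""

    def __init__(self, cache, data_store):
        self.cache = cache
        self.data_store = data_store

    def read(self, key):
        value = self.cache.get(key)
        if value is None:                    # miss: emulate read-through
            value = self.data_store[key]     # fetch from the original store
            self.cache[key] = value          # populate the cache on demand
        return value

    def write(self, key, value):
        self.data_store[key] = value         # emulate write-through:
        self.cache[key] = value              # keep cache and store in step

store = CacheAsideStore(cache={}, data_store={"sku-1": "widget"})
store.read("sku-1")    # first read goes to the data store and seeds the cache
store.read("sku-1")    # second read is served from the cache
```

Because every write goes to both the cache and the original data store, the cache remains a transient copy and no data is lost if it becomes unavailable.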
In some scenarios, caching data that experiences high volatility without immediately persisting changes
to the original data store can be advantageous. For example, an application can modify the data in
cache, and if the application expects the data to be changed again very quickly it can refrain from
updating the original data store until the system becomes quiescent, and then save the data in the
original data store only as it appears in this quiescent state. In this way, the application can avoid
performing a number of slow, expensive write operations to the data store and the data store
experiences less contention. However, do not use this strategy if the application cannot safely
reconstruct its state if the cache is lost, or if the system requires a full audit trail of every change made
to the data.
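The following sketch illustrates this deferred-write idea: rapid updates accumulate in the cache, and only the final, quiescent state is persisted. All names are illustrative, and as noted above the approach is only safe when the application can tolerate losing the unflushed changes.

```python
class WriteBehindCache:
    """Sketch of deferring writes for volatile data: changes accumulate in
    the cache and reach the data store only on an explicit flush.
    Illustrative names; not any Azure API."""

    def __init__(self, data_store):
        self.data_store = data_store
        self.cache = {}
        self._dirty = set()                # keys changed since the last flush

    def write(self, key, value):
        self.cache[key] = value            # fast in-cache update only
        self._dirty.add(key)

    def flush(self):
        for key in self._dirty:            # one slow write per key, not per change
            self.data_store[key] = self.cache[key]
        self._dirty.clear()

store = {}
cache = WriteBehindCache(store)
cache.write("counter", 1)
cache.write("counter", 2)
cache.write("counter", 3)   # three rapid updates...
cache.flush()               # ...cost a single write to the data store
```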
Managing Data Expiration in a Cache
In most cases, data held in a cache is a copy of the data held in the original data store. It is possible that
the data in the original data store might change after it was cached, causing your cached data to
become stale. Many caching systems, including Windows Azure Cache, enable you to configure the
cache to expire data and reduce the period for which data may be out of date.
When cached data expires it is removed from the cache, and the application must retrieve the data from
the original data store (it can put the newly-fetched information back into cache). You can set a default
expiration policy when you configure the cache. If you are using Windows Azure Cache, you can also
stipulate the expiration period for individual objects in a cache when you store them programmatically
in the cache. This setting overrides any cache-wide expiration policy, but only for the objects specified.
Consider the expiration period for your cache and the objects that it contains carefully. If you make it
too short, objects will expire too quickly and you will lose the benefits of using the cache. If you make
the period too long, you risk the data becoming stale.
It is also possible that the cache might fill up if data is allowed to remain resident for a long time. In this
case, any requests to add new items to the cache might cause some items to be forcibly removed, in a
process known as eviction. Windows Azure Cache evicts data on a least-recently-used (LRU) basis, but
you can override this policy and prevent items from being evicted. However, if you adopt this approach
you risk your cache exceeding the memory that it has available, and an application that attempts to add
an item to the cache will fail with an exception.
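The following sketch shows least-recently-used eviction in miniature. Capacity is counted in items here purely for clarity; a real caching service budgets memory, and the class is illustrative rather than any Azure API.

```python
from collections import OrderedDict

class LRUCache:
    """Sketch of LRU eviction: when the cache is full, adding a new item
    forcibly removes the item unused for the longest time."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()          # iteration order == recency order

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)         # a read makes the item "recent"
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        elif len(self._store) >= self.capacity:
            self._store.popitem(last=False)  # evict the least recently used
        self._store[key] = value

cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # touch "a", so "b" is now least recently used
cache.put("c", 3)       # cache is full: "b" is evicted
```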
Managing Concurrency in a Cache
Caches are often designed to be shared by multiple instances of an application. Each application
instance can read and modify data in the cache. Consequently, the same concurrency issues that arise
with any shared data store are also applicable to a cache. In a situation where an application needs to
modify data held in the cache, you may need to ensure that updates made by one instance of the
application do not blindly overwrite the changes made by another instance. Depending on the nature of
the data and the likelihood of collisions, you can adopt one of two approaches to concurrency:
- Optimistic. Immediately prior to updating the data, the application checks to see whether the
data in the cache has changed since it was retrieved. If the data is still the same, the change can
be made; otherwise the application has to decide whether to proceed (the business logic that
drives this decision will be application-specific). This approach is suitable for situations where
updates are infrequent, or where collisions are unlikely to occur.

- Pessimistic. The application locks the data in the cache when it retrieves it to prevent another
instance from changing the data. This process ensures that collisions cannot occur, but it could
block other instances that need to process the same data. Pessimistic concurrency can affect the
scalability of the solution and should be used only for short-lived operations. This approach may
be appropriate for situations where collisions are more likely, especially if an application
updates multiple items in the cache and needs to ensure that these changes are applied
consistently.
By default, Windows Azure Cache implements an optimistic approach to concurrency based on the version
information held by an item in the cache. If you need to implement pessimistic concurrency, the
Windows Azure Caching API includes methods that enable you to lock data as it is read or written.
For more information, see Concurrency Model for Windows Azure Cache Service on the MSDN
website.
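The following sketch models version-based optimistic concurrency: an update succeeds only if the version the caller originally read is still current. It illustrates the idea behind such APIs; the class and method names are assumptions, not the actual Windows Azure Caching methods.

```python
class VersionedCache:
    """Sketch of optimistic concurrency: every item carries a version number,
    and an update succeeds only if the caller's version still matches."""

    def __init__(self):
        self._store = {}                     # key -> (value, version)

    def get(self, key):
        return self._store.get(key)          # returns (value, version)

    def put_if_unchanged(self, key, value, expected_version):
        _, current_version = self._store.get(key, (None, 0))
        if current_version != expected_version:
            return False                     # another instance changed it: give up
        self._store[key] = (value, current_version + 1)
        return True

cache = VersionedCache()
cache.put_if_unchanged("stock", 100, expected_version=0)  # initial write
value, version = cache.get("stock")                       # both instances read v1
cache.put_if_unchanged("stock", value - 1, version)       # instance A wins
ok = cache.put_if_unchanged("stock", value - 5, version)  # instance B: stale version
```

Instance B's update is rejected because the version it read is no longer current, so it must re-read the item and decide whether to retry rather than blindly overwriting instance A's change.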
Implementing High Availability and Security
Windows Azure Cache provides a high-availability option that implements automatic failover should part
of the cache become unavailable. Additionally, if you create a cache by using the Windows Azure Cache
Service you should protect the data held in the cache from unauthorized access.
Determining whether to implement caching, deciding which data to cache, estimating the size of the
cache, and planning the most appropriate caching topology to use is a complex and application-specific
task. The page Capacity Planning for Windows Azure on MSDN provides some detailed
guidance and tools that you can use to determine a cost-effective strategy for caching data by using
Windows Azure Cache.
Related Patterns and Guidance
The following patterns and guidance may also be relevant to your scenario when implementing caching
in your applications:
- Cache-Aside Pattern. This pattern describes how to load data on demand into a cache from a
data store. This pattern also helps to maintain consistency between data held in the cache and
the data in the original data store.
More Information
- The page MemoryCache Class on the MSDN website describes the MemoryCache class.

- The Windows Azure Cache page on the MSDN website provides an overview of the different
caching options available to Windows Azure applications and services.

- The page ASP.NET 4 Cache Providers for Windows Azure Cache on the MSDN website provides
more information about the Session State Provider and the Output Cache Provider for Windows
Azure Cache.

- The page Capacity Planning for Windows Azure on the MSDN website provides some detailed
guidance and tools that you can use to determine a cost-effective strategy for caching data by
using Windows Azure Cache.