
java – How to create a reusable Map

Posted by: admin February 25, 2020

Questions:

Is there a way to populate a Map once from the DB (through a Mongo repository) and then reuse it whenever it is required from multiple classes, instead of hitting the database through the repository each time?

Answers:

As per your comment, what you are looking for is a caching mechanism. Caches are components that keep data in memory, as opposed to files, databases or other media, so as to allow fast retrieval of information (at the cost of a higher memory footprint).

There are various tutorials online, but caches usually share the following behaviour:
1. They are key-value pair structures.
2. Each entry living in the cache also has a Time To Live (TTL), that is, how long it will be considered valid.

You can implement this in the repository layer, so the caching mechanism will be transparent to the rest of your application (but you might want to consider exposing functionality that allows clearing/invalidating part or all of the cache).

So basically, when a query reaches your repository layer, check the cache first. If the key exists there, check its time to live; if it is still valid, return the cached value.

If the key does not exist or the TTL has expired, add/overwrite the data in the cache from the DB. Keep in mind that when you update the data model yourself, you should also invalidate the cache accordingly so that fresh data will be pulled from the DB on the next call.
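To make this concrete, here is a minimal sketch of such a TTL cache wrapped around a repository lookup. Everything in it (UserRepository, findUserById, the 30-second TTL) is a hypothetical example of the pattern described above, not your actual repository; it also ignores thread safety and cache size limits.

import java.util.HashMap;
import java.util.Map;

// Hypothetical repository interface, used only for this sketch
interface UserRepository {
    String findUserById(String id);
}

// Caching wrapper around the real repository, transparent to its callers
class CachedUserRepository implements UserRepository {

    // One cache entry: the cached value plus the time it was loaded
    private static class CacheEntry {
        final String value;
        final long loadedAtMillis;

        CacheEntry(String value, long loadedAtMillis) {
            this.value = value;
            this.loadedAtMillis = loadedAtMillis;
        }
    }

    private static final long TTL_MILLIS = 30_000; // assumed 30-second Time To Live

    private final UserRepository delegate;          // the real DB-backed repository
    private final Map<String, CacheEntry> cache = new HashMap<>();

    CachedUserRepository(UserRepository delegate) {
        this.delegate = delegate;
    }

    @Override
    public String findUserById(String id) {
        CacheEntry entry = cache.get(id);
        long now = System.currentTimeMillis();
        // Hit the DB only if the key is missing or its TTL has expired
        if (entry == null || now - entry.loadedAtMillis > TTL_MILLIS) {
            String fresh = delegate.findUserById(id);
            entry = new CacheEntry(fresh, now);
            cache.put(id, entry);
        }
        return entry.value;
    }

    // Lets callers invalidate an entry when they update the data themselves
    public void invalidate(String id) {
        cache.remove(id);
    }
}

Since the rest of the application only sees the UserRepository interface, the cached wrapper can be swapped in or out without changing any callers.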

Answer:

You can declare the map field as public static, which would allow application-wide access via ClassLoadingData.mapField.
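For illustration, a minimal sketch of that approach (the String key/value types are just placeholders):

import java.util.HashMap;
import java.util.Map;

class ClassLoadingData {
    // Populated once (e.g. at application startup) and then read from anywhere
    public static Map<String, String> mapField = new HashMap<>();
}

// ... elsewhere in the application:
String value = ClassLoadingData.mapField.get("someKey");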

I think a better solution, if I understood the problem correctly, would be a memoized function, that is, a function that stores the result of its call. Here is a sketch of how this could be done (note that it does not handle possible synchronization problems in a multi-threaded environment):


import java.util.HashMap;
import java.util.Map;

class ClassLoadingData {

  private static Map<KeyType, ValueType> memoizedData = new HashMap<>();

  public Map<KeyType, ValueType> getMyData() {
    if (memoizedData.isEmpty()) { // you can use a more complex check to handle data refresh
      populateData(memoizedData);
    }
    return memoizedData;
  }

  private void populateData(Map<KeyType, ValueType> data) {
    // do your query, and put the results into the given map
  }

}



Answer:

Premise: I suggest you use an object-relational mapping tool like Hibernate in your Java project to map the object-oriented domain model to a relational database and let the tool handle the cache mechanism implicitly. Hibernate specifically implements a multi-level caching scheme (take a look at the following link for more information: https://www.tutorialspoint.com/hibernate/hibernate_caching.htm).
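As a rough sketch of what enabling Hibernate's second-level cache can look like, here is a hypothetical cacheable entity (the Employee entity is an assumption for illustration; the annotations are standard Hibernate/JPA ones):

// Hypothetical entity marked as cacheable in Hibernate's second-level cache
import javax.persistence.Cacheable;
import javax.persistence.Entity;
import javax.persistence.Id;

import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;

@Entity
@Cacheable
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
public class Employee {
    @Id
    private Long id;
    private String name;
    // getters and setters omitted
}

You also need to enable the second-level cache in your Hibernate configuration (hibernate.cache.use_second_level_cache=true) and configure a cache provider such as Ehcache; the exact region factory class depends on the provider and Hibernate version you use.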

Regardless of my suggestion in the premise, you can also manually create a singleton class that will be used by every class in the project that interacts with the DB:

import java.util.Map;

import org.joda.time.DateTime;
import org.joda.time.Seconds;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MongoDBConnector {

    private static final Logger LOGGER = LoggerFactory.getLogger(MongoDBConnector.class);

    private static MongoDBConnector instance;

    //Cache period in seconds
    public static int DB_ELEMENTS_CACHE_PERIOD = 30;

    //Latest cache update time
    private DateTime latestUpdateTime;

    //The cache data layer from DB
    private Map<KType,VType> elements;

    private MongoDBConnector() {
    }

    public static synchronized MongoDBConnector getInstance() {
        if (instance == null) {
            instance = new MongoDBConnector();
        }
        return instance;
    }

}

Here you can then define a load method that updates the map with the values stored in the DB, and also a write method that writes values to the DB, with the following characteristics:

1- These methods should be synchronized in order to avoid issues when multiple calls are performed concurrently.

2- The load method should apply a cache-period logic (maybe with a configurable period) to avoid loading the data from the DB on every method call.

Example: Suppose your cache period is 30s. This means that if 10 reads are performed from different points in the code within 30s, you will load data from the DB only on the first call, while the others will read from the cached map, improving performance.

Note: The greater the cache period, the better the performance of your code, but if the DB is also modified externally (from another tool or manually) you will create inconsistencies between the cache and the DB. So choose the value that works best for you.

public synchronized Map<KType, VType> getElements() throws ConnectorException {
    final DateTime currentTime = new DateTime();
    if (latestUpdateTime == null || (Seconds.secondsBetween(latestUpdateTime, currentTime).getSeconds() > DB_ELEMENTS_CACHE_PERIOD)) {
        LOGGER.debug("Cache is expired. Reading values from DB");
        //Read from DB and update the cache
        //....

        latestUpdateTime = currentTime;
    }
    return elements;
}

3- The store method should automatically update the cache if the insert is performed correctly, regardless of whether the cache period has expired:

public synchronized void storeElement(final VType object) throws ConnectorException {
    //Insert the object into the DB ( throws a ConnectorException if the insert fails )
    //...

    //Update the cache regardless of the cache period
    loadElementsIgnoreCachePeriod();
}
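The loadElementsIgnoreCachePeriod() helper is not shown in the answer; a minimal sketch, assuming it simply re-reads the DB and resets the cache timestamp, might look like this:

public synchronized void loadElementsIgnoreCachePeriod() throws ConnectorException {
    LOGGER.debug("Forcing cache refresh after a successful write");
    //Read from DB and update the elements map
    //...

    latestUpdateTime = new DateTime();
}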

Then you can get the elements from any point in your code as follows (note that getElements() is an instance method, so it is accessed through getInstance()):

Map<KType,VType> liveElements = MongoDBConnector.getInstance().getElements();