Rethinking MVC Frameworks

In the previous post I explained the need for an application framework that lets us evolve functionality as our startup's processes evolve. In this post, we will go into the technical details of what we expect from such a framework.

Almost all server-side MVC frameworks provide a standard set of functionalities:

  • A way to receive an HTTP request from the client, examine it, and determine what the user wants to do. A URL mapper transfers control to a controller, which then processes the request.
  • The controller fetches data from backend services (e.g. databases).
  • Along the way, data is transformed from relational form into object form with the help of an Object-Relational Mapper (we call the result a model).
  • The model data is then transformed into a view with the help of templates (see the sketch after this list).
  • Mature MVC frameworks provide several other facilities as well: sessions, caching, REST endpoints and so on.
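
To make this flow concrete, here is a minimal, framework-agnostic sketch in Python. The URL mapper, controller, model and template here are hypothetical stand-ins rather than any particular framework's API, but the shape mirrors what most server-side MVC frameworks do.

```python
from string import Template

# Hypothetical stand-ins for a database table and a template file.
USERS_TABLE = {42: {"id": 42, "name": "Ada", "email": "ada@example.com"}}
USER_TEMPLATE = Template("<h1>$name</h1><p>$email</p>")


class User:
    """The 'model': a relational row mapped into an object (the ORM's job)."""
    def __init__(self, row):
        self.id, self.name, self.email = row["id"], row["name"], row["email"]


def show_user(user_id):
    """The 'controller': fetch data, build the model, render the view."""
    user = User(USERS_TABLE[int(user_id)])                # ORM step
    return USER_TEMPLATE.substitute(name=user.name,       # template step
                                    email=user.email)


# The 'URL mapper': decides which controller handles which request path.
ROUTES = {"/users/<id>": show_user}


def handle_request(path):
    # A real router would pattern-match the path; this sketch just splits it.
    prefix, _, user_id = path.rpartition("/")
    return ROUTES[prefix + "/<id>"](user_id)


print(handle_request("/users/42"))  # -> <h1>Ada</h1><p>ada@example.com</p>
```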

Over the last decade, a few trends have been changing the way we work with MVC frameworks. For one, clients are becoming more intelligent and thicker, with logic moving from the server to the client where possible. This has resulted in the emergence of client-side MVC frameworks.

Client-side web MVC frameworks provide mechanisms by which the view (rendered in the HTML DOM) is kept in sync with client-side models. CRUD operations are typically performed over AJAX, which keeps the client-side models in sync with the server through REST services. Templating, too, moves mostly to the client.

The most popular client-side MVC frameworks are written to be completely agnostic of the server, and JSON is the most common format in which the client and server exchange data. There are also frameworks whose client and server components are tightly coupled.

So when we start building an application, we need to choose between two alternatives:
1. Client-side and server-side MVC frameworks that are agnostic of each other.
2. A single MVC framework that provides a simpler interface from the UI all the way to the database.

The pros and cons of the two alternatives are obvious. We prefer the first option, as it lets us evolve the client and the server independently of each other and adopt the latest and best technologies for each problem.

The second major trend in recent years has been the use of NoSQL databases, or of NoSQL-style denormalization patterns in relational databases. NoSQL databases provide the benefit of near-transparent scalability. Developers exploit their lack of a predefined schema to evolve the schema rapidly as needs change, avoiding expensive schema-migration headaches.

While this apparent lack of schema seems like a good idea in the beginning, developers who have used NoSQL for a while have likely faced issues with inconsistent schemata in their applications. The emergence of Object Document Mappers (ODMs) is proof of this. Wait a minute! Wasn't the lack of a schema a feature that we would boast about in a NoSQL store? Doesn't adding an ODM defeat the purpose?

The answer lies in the fact that schema changes in a relational database are harder, since they require changes to the underlying database structures, while schema changes in an ODM require only code changes, and old documents can be migrated to the new schema incrementally, without downtime.
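
As a rough illustration (the collection and field names are hypothetical), a document can carry an explicit schema version, and the application can upgrade old documents lazily whenever it reads them, so no big-bang migration or downtime is needed:

```python
CURRENT_VERSION = 2


def migrate_v1_to_v2(doc):
    # Hypothetical change: v1 stored a single "name"; v2 splits it in two.
    first, _, last = doc.pop("name", "").partition(" ")
    doc["first_name"], doc["last_name"] = first, last
    doc["_schema_version"] = 2
    return doc


# One migration function per version step, applied in order as needed.
MIGRATIONS = {1: migrate_v1_to_v2}


def load_user(doc):
    """Upgrade a stored document to the current schema as it is read."""
    version = doc.get("_schema_version", 1)
    while version < CURRENT_VERSION:
        doc = MIGRATIONS[version](doc)
        version = doc["_schema_version"]
    return doc


old_doc = {"_id": "u1", "name": "Ada Lovelace"}  # written by older code
print(load_user(old_doc))
# {'_id': 'u1', 'first_name': 'Ada', 'last_name': 'Lovelace', '_schema_version': 2}
```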

However, the use of a NoSQL store has its own downsides. While schema evolution is a strength of a NoSQL store, the lack of entity relationships means that developers have to do more work to keep entities related. And then there is the question of how much denormalization is appropriate: should we store only the primary keys, or should we also duplicate a portion of the referred record in the referring record to save an additional lookup?
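
The two options look roughly like this (the documents and field names are illustrative, not taken from any particular schema). The first stores only a reference and needs a second lookup to display the customer; the second embeds a small snapshot of the referred record, saving that lookup at the cost of keeping the copy in sync:

```python
# Option 1: store only the primary key of the referred record.
# Rendering this order requires a second lookup to fetch the customer.
order_with_reference = {
    "_id": "order-1001",
    "customer_id": "cust-42",
    "total": 99.50,
}

# Option 2: duplicate a portion of the referred record in the referring one.
# Saves a lookup when rendering the order, but the embedded copy goes stale
# if the customer's name or city changes and must then be updated.
order_with_snapshot = {
    "_id": "order-1001",
    "customer": {"_id": "cust-42", "name": "Ada Lovelace", "city": "London"},
    "total": 99.50,
}
```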

Most ORMs today lack support for NoSQL databases, and the ones that do support them don't handle the requirements above (schema migration and denormalization) well.

So that brings us back to the original question: what features should a modern-day MVC framework have?

We are looking for the following features:

  • The client and server parts of the application should be agnostic of each other. In a world of ever-changing frameworks, we need the ability to switch to a new client-side library or a server-side framework (perhaps in an entirely different language) easily. The client, the server and the database should each be evolvable without requiring a complete rewrite of the other parts of the stack.
  • We want multiple versions of a schema to co-exist for the same entity. This allows us to evolve our application very rapidly, while deciding separately how we want to handle old data that adheres to an earlier version of the schema.
  • Since we are talking about multiple versions of the same schema, relational databases are a no-go, as co-existing schema versions cannot be implemented in a relational database easily.
  • New fields are added all the time, especially in back-office applications at a startup. If adding a field requires us to modify even a single line of code, it may be a no-go, since that forces a redeployment of the application. We want to avoid redeployment for trivial schema changes.
  • The use of a NoSQL database shouldn't prevent us from relating records of different entities, since we believe that all knowledge of individuals, organizations and the world is part of a single global graph with different access levels.
  • We need the ability to choose among different migration options: eager, lazy or on the fly (see the sketch after this list).
  • It is preferable to have an admin interface that can perform trivial CRUD operations on any of the models we have defined, without requiring us to write much code.
  • Client-side data binding and AJAX are preferred over server-side handling of forms.
  • The application should understand both client-side and server-side routes.
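
To illustrate what we mean by different migration options, here is a hedged sketch in Python, using the same versioned-document idea as the earlier example (the collection, the migration and the function names are hypothetical, and the exact split between "lazy" and "on the fly" is our interpretation): the same upgrade routine can be run eagerly as a batch job, lazily on every read, or on the fly by persisting the upgraded document when it is next touched.

```python
CURRENT_VERSION = 2


def add_status_field(doc):
    # Hypothetical v1 -> v2 migration: add a "status" field with a default.
    doc.setdefault("status", "active")
    doc["_schema_version"] = 2
    return doc


MIGRATIONS = {1: add_status_field}


def upgrade(doc):
    """Apply any pending migrations to one document, in version order."""
    version = doc.get("_schema_version", 1)
    while version < CURRENT_VERSION:
        doc = MIGRATIONS[version](doc)
        version = doc["_schema_version"]
    return doc


# A stand-in for a NoSQL collection, keyed by document id.
collection = {"u1": {"_id": "u1", "name": "Ada"}}


def migrate_eagerly():
    """Eager: a one-off batch job upgrades every stored document up front."""
    for key in collection:
        collection[key] = upgrade(collection[key])


def read_lazily(key):
    """Lazy: upgrade on read only; the stored document is left untouched."""
    return upgrade(dict(collection[key]))


def read_on_the_fly(key):
    """On the fly: upgrade on access and write the result back to storage."""
    collection[key] = upgrade(collection[key])
    return collection[key]
```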

TAME is an experimental framework we have built that attempts to provide the features listed above. In the next blog post, we will go into the technical details of some of the key concepts we have implemented in TAME.