
The evolution of APIs

We will trace APIs' roots from their beginnings to the sophisticated tools they are today.
Motunrayo Moronfolu

May 30, 2024

The introduction of APIs sparked a digital renaissance. Applications, once isolated, could now access information beyond their confines. Today, APIs are all around us, powering a wide range of features we use daily, from weather applications to social media logins and countless other services.

However, the ease we enjoy today was not always the experience. In this article, we will trace APIs' roots from their beginnings to the sophisticated tools they are today.


#Pre-REST (mid 1940s-2000)

Before the Representational State Transfer (REST) architectural style was introduced for designing networked applications, communication between applications was far more complex. Various methods, such as remote procedure calls and message queues, were used during this period to facilitate communication between systems and services.

Early communications (1940s-1960s)

In this era, the concept of APIs as we know them today did not exist. Instead, computers communicated via direct connections using physical mediums like punch cards or cables.

These connections were limited in scalability and often required custom programming to interpret the exchanged data formats, since different computer systems were largely incompatible with one another.

Despite the limitations, this era laid the groundwork for the structured data exchange facilitated by APIs.

RPC (1960s-1990s)

Remote Procedure Call (RPC) allows programs to invoke functions on remote systems as if they were local, within a client-server architecture. RPC was developed because developers wanted to access remote machines' functionality and needed the programs on those machines to interact with each other seamlessly. RPC offered this simplicity without requiring developers to know the underlying network protocols of those computers.

RPC works like a remote control for functions. You call a local stub function (acting as a wrapper), but behind the scenes, an RPC framework packages your data and sends it to the server. There, a server-side stub unpacks the data and runs the actual procedure. The result is packaged, sent back, unpacked, and delivered to your program as if it had run locally.


Remote Procedure Call

The development of RPC significantly influenced the creation of various distributed systems, such as the Network File System (NFS), which allows access to remote files, and Remote Method Invocation (RMI), a Java-specific mechanism for communication between Java objects on different machines.

While RPC simplified remote procedure invocation and laid the groundwork for distributed computing as we know it today, it also had limitations: setup complexity, programming-language dependency that made it difficult to use across heterogeneous systems, and tight coupling between client and server, where changes to one could require corresponding changes to the other, hindering scalability and flexibility.

This led to the rise of Service-Oriented Architecture (SOA) in the late 1990s. SOA aimed to break down applications into smaller, modular services that could be easily integrated and reused. These services communicated using standardized interfaces, often leveraging technologies like Simple Object Access Protocol (SOAP) and Extensible Markup Language (XML).

SOAP & XML (1990s-2000s)

Before SOAP and XML, communication between applications was custom-built, varying from one system to another like handwritten notes; it was error-prone and difficult to interpret universally. There were no standardized messaging formats or protocols for communication between different platforms and programming languages, leading to interoperability and data-exchange challenges. This prompted the development of SOAP and XML.

The introduction of SOAP and XML marked a significant advancement for APIs, particularly for web services and distributed computing.

SOAP defined a structured format for messages exchanged between applications, including data format, encoding, and invocation details. This standardization brought the first significant leap toward platform independence, enabling APIs to be used across various operating systems and programming languages.

Below is an XML snippet showing a SOAP request for retrieving information about all "products":

<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Header/>
  <soap:Body xmlns:m="http://www.example.api/products">
    <m:GetAllProducts/>
  </soap:Body>
</soap:Envelope>

Although SOAP and XML brought much-needed standardization to API design and were dominant for a period, they can be verbose and complex to set up. Even the seemingly simple snippet above requires several nested layers, from the envelope and namespace declarations down to the actual request element.

In real-world scenarios, SOAP messages encoded in XML can become far more verbose for complex requests involving nested data structures. This leads to larger messages that consume more bandwidth and take longer to parse.

These limitations, among others, led to the rise of newer, lighter-weight API styles like REST in the early 2000s.

#REST (2000s-present)

REST was created by Roy Fielding in his 2000 doctoral dissertation, "Architectural Styles and the Design of Network-based Software Architectures", and has since become a dominant architectural style for modern APIs.

One of the primary obstacles of the pre-REST era that led to REST's innovation was the lack of standardization in APIs. This limitation had implications that extended to further challenges, such as:

  • Interoperability: In the absence of standardized protocols, each system often had its own proprietary data formats, protocols, or communication methods, making integration difficult. Even within a single organization, without a comprehensive style guide, different teams might adopt diverse formats for API design, leading to scattered and disjointed systems.
  • Vendor lock-in: The lack of standardization also led to vendor lock-in, where organizations became dependent on specific vendors for the services they provided, because migrating to alternative solutions would require significant rework.

These challenges made executing even the most basic tasks difficult, and the introduction of REST fixed this to a large extent.

One of REST's key features is its approach to interoperability. REST achieves this by leveraging existing web standards like HTTP verbs (GET, POST, PUT, DELETE) and URIs (Uniform Resource Identifiers). This means the URI for a specific resource remains constant, but the action performed changes based on the HTTP verb used.

This reliance on existing, standardized web technology allows applications to communicate seamlessly regardless of the programming language used, as long as the platform supports HTTP.

This significantly simplifies onboarding new developers to the system. Since they are already familiar with HTTP verbs, they can quickly grasp the core functionalities without learning custom inter-system communication protocols.

Here is an example of a GET request to retrieve all products from a RESTful API:

GET /api/products

"GET" is the HTTP verb indicating the action we want to execute on the server—retrieval—and /api/products is the URI specifying the resource we are interested in—"products."

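As a sketch (using a hypothetical product ID of 42), the same resource can support different actions simply by changing the verb while the URI pattern stays stable:

GET /api/products/42
POST /api/products
PUT /api/products/42
DELETE /api/products/42

Here, GET retrieves product 42, POST creates a new product, PUT replaces product 42, and DELETE removes it, all without any custom protocol.
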
Following the introduction of REST principles in the early 2000s, companies like eBay, IBM, Google, Amazon, and Flickr began adopting RESTful APIs for building web services and exposing their functionalities.

Despite REST’s design simplicity, flexibility, and widespread adoption, it faces efficiency and performance issues in certain use cases. Common problems include overfetching, where REST endpoints return more data than the client needs, and underfetching, where only some necessary data is returned, requiring additional requests to retrieve related information.
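
As a quick illustration of overfetching (with a hypothetical products endpoint and made-up fields), a client that only needs a product's name and price still receives the full record:

GET /api/products/42

{
  "id": 42,
  "sku": "SKU-0042",
  "name": "Wireless keyboard",
  "price": 49.99,
  "description": "Long marketing copy the client never displays",
  "warehouseLocations": ["BER-1", "AMS-2"],
  "internalNotes": "These extra fields travel over the wire anyway"
}

Fetching related data, such as the product's reviews, would then require additional requests, which is the underfetching side of the same problem.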

REST's reliance on multiple requests and responses can also create "chatty clients," particularly for mobile applications with limited bandwidth. These factors can lead to unnecessary bandwidth consumption and processing overhead that impact system performance.

Mitigating these limitations requires close communication and collaboration between frontend and backend teams. Backend teams can carefully design and optimize APIs to return precisely the data needed by frontend clients, minimizing unnecessary data transfer and optimizing performance.

However, this approach can increase backend complexity, create overly tight coupling between the systems, and necessitate frequent, otherwise avoidable communication between both teams to ensure that any required changes to the API are promptly addressed. This has led to a new API architectural style, GraphQL, which offers greater flexibility and efficiency in data retrieval, making it a compelling alternative to REST APIs.

#REST alternatives

While REST remains a popular and widely used API style, the performance concerns examined above have prompted new API developments. In this section, we will briefly explore two major contenders to REST, GraphQL and gRPC, and the unique advantages they provide.

GraphQL (2012-present)

GraphQL is a query language developed at Meta (previously known as Facebook) in 2012 in response to the REST API challenges, particularly concerning complex data requirements and inefficient data fetching. GraphQL changed how clients interact with APIs by allowing them to take control of data fetching, specifying precisely the data they need in a single query using a flexible language.

Below is an example of a GraphQL query that makes a single request to the GraphQL server to fetch “product” data:

query GetProducts {
  products {
    totalItems
    items {
      id
      sku
      name
      price {
        regularPrice {
          value
          currency
        }
      }
    }
  }
}

Another major feature GraphQL provides is a schema that defines the available data and its relationships. This schema acts as a contract between the API and the client. When the API evolves, the schema is updated to reflect the changes. This schema-driven approach makes APIs more adaptable and easier to maintain in the long run.
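
For illustration, a simplified, hypothetical schema behind the earlier query might look like the following, with type and field names mirroring that query:

type Query {
  products: ProductList
}

type ProductList {
  totalItems: Int
  items: [Product]
}

type Product {
  id: ID
  sku: String
  name: String
  price: ProductPrice
}

type ProductPrice {
  regularPrice: Money
}

type Money {
  value: Float
  currency: String
}

Because the schema is machine-readable, clients and tooling can introspect it and validate queries before they are ever sent to the server.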

Following its introduction, GraphQL has experienced significant growth, establishing itself as the third most popular API architecture. Companies such as Netflix, LinkedIn, and others have adopted GraphQL as the core or part of their tech stack.

Check out this 2024 GraphQL survey from Hygraph, which provides more information on the state of GraphQL today.

gRPC (2016-present)

gRPC (Google Remote Procedure Call) is an open-source framework, initially created by Google, that enhances the RPC model of the 1960s. It was created in response to the scalability and performance issues Google faced while building web services, particularly for microservices and real-time applications.

While solutions like REST and GraphQL existed, they struggled to meet Google's demanding requirements for real-time communication and high-throughput data processing with consistently low latency.

To address these limitations, Google created gRPC. It uses modern protocols like HTTP/2 and language-neutral data formats (Protocol Buffers) for efficient and reliable communication between applications over networks.


In addition to supporting basic communication patterns like unary (single request, single response) and server streaming (server sends multiple responses) RPCs, gRPC also introduces powerful bidirectional streaming, where both the gRPC server and client can send a stream of messages asynchronously. This flexibility enables real-time updates, event-driven architectures, and efficient data transfer in applications requiring continuous communication between client and server.
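
For illustration, a hypothetical Protocol Buffers service definition might declare all three patterns like this (the TelemetryService name and message fields are made up for this sketch):

syntax = "proto3";

package telemetry;

// Illustrative messages; the field names are placeholders.
message SensorReading {
  string device_id = 1;
  double value = 2;
}

message Ack {
  bool ok = 1;
}

service TelemetryService {
  // Unary: single request, single response.
  rpc SendReading (SensorReading) returns (Ack);

  // Server streaming: one request, a stream of responses.
  rpc WatchDevice (SensorReading) returns (stream SensorReading);

  // Bidirectional streaming: both sides send message streams asynchronously.
  rpc StreamReadings (stream SensorReading) returns (stream Ack);
}

From a definition like this, gRPC tooling generates client and server stubs in each supported language.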

Since the introduction of gRPC, companies beyond Google, such as GitLab, Lyft, and Netflix, have leveraged it for communication between their various microservices.

While both gRPC and GraphQL emerged to address the growing complexity of modern software development, gRPC excels at building high-performance, scalable web services, especially microservices. For more straightforward use cases, however, REST or GraphQL typically involve less development complexity and are easier to set up.

#Beyond just data fetching

Today, APIs are often equated with RESTful or GraphQL-style APIs, the backbone of data exchange in modern software development. However, API use cases extend beyond just retrieving data; some APIs exist to provide instructions and control over functionality within an application or environment.

An example of this broader concept is the relationship between web browsers and the fundamental web development triad: HTML, CSS, and JavaScript. While they may not be conventionally viewed as APIs, they essentially function as such, with HTML providing the structure and semantics of a webpage, CSS dictating its appearance and styling, and JavaScript enabling dynamic behavior and interactivity — all together instructing web browsers on how to render and interact with web content.

We also have web APIs, an indispensable toolkit for web developers. These APIs extend the capabilities of the web and allow developers to interact with various aspects of web browsers, operating systems, and hardware devices.

This interaction, in turn, enables developers to perform activities like creating a location-based mobile platform using the Geolocation API or performing seamless online transactions using the Payment Request API. These web APIs, among many others, help to push the boundaries of what is possible on the web.
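
As a brief sketch, requesting the user's position with the browser's Geolocation API looks like the following; the callbacks are illustrative, and the browser asks the user for permission before sharing any coordinates:

// Ask the browser for the device's current position.
navigator.geolocation.getCurrentPosition(
  (position) => {
    // Success: the coordinates can feed a location-based feature.
    console.log("Latitude:", position.coords.latitude);
    console.log("Longitude:", position.coords.longitude);
  },
  (error) => {
    // Failure: permission denied, position unavailable, or timeout.
    console.error("Could not get location:", error.message);
  }
);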

Another notable example is operating system APIs, which act as bridges, enabling applications to access and control a computer's underlying hardware and system resources. One example is a Camera API, which lets applications such as image platforms integrate with a system's camera hardware so users can capture photos and videos directly within those applications.

In addition, GraphQL provides subscriptions, which enable applications to push data updates to clients whenever there are changes on the server. This feature provides highly responsive and dynamic user experiences in applications like chat, social media feeds, and collaborative editing tools.
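
For example, a hypothetical subscription for price updates, reusing the product fields from earlier, could look like this; the server pushes a new payload to every subscribed client whenever the underlying event fires:

subscription OnPriceChanged {
  productPriceChanged {
    id
    name
    price {
      regularPrice {
        value
        currency
      }
    }
  }
}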

Check out this article for some of the top advantages of GraphQL in your applications.

Similarly, gRPC is finding a place in the IoT industry beyond microservices and streaming services, thanks to its efficient communication protocols, which speed up interaction between IoT devices and backend systems and improve functionality for a truly connected world.

#Conclusion

The evolution of APIs, from the punch-card era to the sophistication of modern approaches like GraphQL and gRPC, showcases a relentless pursuit of efficiency and flexibility in communication between applications. While data retrieval remains a core function, we have seen APIs evolve to encompass a broader range of functionality, from data exchange to hardware interaction.

As technology advances, APIs will continue to play a crucial role in facilitating seamless communication and interoperability between software systems. There are many reasons to prefer GraphQL, including its structured data, ease of use, and type checking.

If you are considering using GraphQL in production, check out our GraphQL Report 2024, where we learn how the community solves obstacles and best practices from GraphQL experts.


Blog Author

Motunrayo Moronfolu

Technical writer

Motunrayo Moronfolu is a Senior Frontend Engineer and Technical writer passionate about building and writing about great user experiences.