By Oleksandr Lyzun, published on April 4, 2018

Magento Community Driven Project - Asynchronous API

One of our customers had a specific requirement in Magento:

They wanted to import about 2,000 products into Magento using the standard Magento API. For this task, they used a self-developed ESB solution that pushed all products to Magento via the REST API. However, the customer's ERP can only process API requests asynchronously, so it pushed all the products to Magento simultaneously, without waiting for a status or response from Magento.

This led to several problems that we had to solve for the customer. In this article you'll read how we solved them by developing a better way to deal with large product imports.

The problems we faced

During this implementation we noticed that the default Magento REST API did not work correctly for our use case.

Problem 1: Database table deadlocks

This problem first appeared when we tried to import a lot of data into Magento asynchronously. During the import process we continuously received deadlocks on table updates. The affected tables were “url_rewrite” and the “media_*” tables:

Example:

Serialization failure: 1213 Deadlock found when trying to get lock; try restarting transaction

Problem 2: No ERP error messages

The customer's ERP system is not capable of saving and tracking error messages. It just sends requests and only checks whether a request was executed successfully or not. As a result, if there was an error, we could not trace how it occurred or how it could be fixed.

Problem 3: Lack of Performance

Sending large numbers of requests to Magento at the same time becomes a serious problem for the database. Hundreds of requests to create or change products in Magento result in a massive database overload; on a live database this reduces performance and affects conversions and the user experience in the shop.

Problem 4: ERP dependency

For massive imports, the ERP system has to stay online until the process is finished in order to track responses and push the products one by one.

Our interim solution 

Since Magento didn’t meet our requirements at this point, we came up with another solution:
We developed a middleware that we placed between Magento and the customer's ERP system to catch all API requests intended for the Magento API.

The middleware then forwards these requests to RabbitMQ, and a consumer takes them from the queue and executes them one by one. This approach allowed us to significantly reduce the system load, prevented deadlocks in the tables, and made it possible to implement additional logging for all requests, so we can react faster if an error occurs.
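The middleware itself was customer-specific, but the underlying pattern is simple: instead of calling Magento directly, every intercepted API request is published to a RabbitMQ queue and processed later. Here is a minimal sketch of the publishing side in Python; the host, queue name, and message format are illustrative assumptions, not the actual middleware code.

import json
import pika

# Connect to RabbitMQ and declare a durable queue (host and queue name are placeholders).
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="magento.api.requests", durable=True)

def enqueue_api_request(method, endpoint, payload):
    """Publish an intercepted ERP request instead of calling Magento directly."""
    message = {"method": method, "endpoint": endpoint, "payload": payload}
    channel.basic_publish(
        exchange="",
        routing_key="magento.api.requests",
        body=json.dumps(message),
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message
    )

# Example: a product create request coming from the ERP system.
enqueue_api_request("POST", "/rest/V1/products", {"product": {"sku": "EXAMPLE-SKU"}})
connection.close()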

ERP - Middleware - Magento

 

Merging the solution with Magento

Since our solution worked as planned and fixed the problems for our customer, we came up with the idea that the same middleware functionality could be implemented directly in the Magento API.

The advantage would be the ability to import large amounts of data without needing a middleware solution, which would reduce the time and cost of new implementations.

This approach was then discussed with the Magento team, and we all agreed that this solution would be a great feature for the Magento Commerce system and could be implemented as part of the Magento contribution program.


In this video we demonstrate the new Magento feature: asynchronous APIs working over a message queue.


API Changes

Within the scope of the implementation we worked together with Balance Internet (https://www.balanceinternet.com.au/) on the Bulk API project. comwrap took on the asynchronous and Bulk API implementation tasks.

The main tasks:

  • Extending the Magento WebAPI module in order to add new URL routes to the Magento REST API. In our case we added an “async” prefix for all API calls, so Magento knows which requests are synchronous and which are asynchronous
  • Create a new route processor for asynchronous requests. This processor had to catch incoming messages, validate them, and push them to the queue (RabbitMQ in our case). It handles:
    • all POST requests
    • all PUT requests
    • all DELETE requests
    • GET requests are excluded
  • Develop a consumer that reads messages from the queue, processes them, writes the results to the corresponding Magento tables, and updates the operation status (a conceptual sketch follows after this list)
  • Create new status endpoints for delivering the results of the executed requests
  • Extend the Swagger schema to work with the new API implementation
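To illustrate the consumer's role, here is a minimal conceptual sketch in Python. This is not the actual Magento implementation (which lives inside the framework and is written in PHP); the queue name, message format, and helper functions are assumptions for illustration only.

import json
import pika

# Hypothetical stand-ins for Magento internals: the real consumer calls the
# corresponding Magento service contracts and writes to the bulk operation tables.
def execute_operation(operation):
    return "processed operation %s" % operation.get("id")

def update_operation_status(operation_id, status, message):
    print("operation %s: %s - %s" % (operation_id, status, message))

# Queue name is an assumption; the host is a placeholder.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="async.operations.all", durable=True)

def handle_message(ch, method, properties, body):
    operation = json.loads(body)
    try:
        result = execute_operation(operation)
        update_operation_status(operation.get("id"), "complete", result)
    except Exception as error:
        update_operation_status(operation.get("id"), "failed", str(error))
    ch.basic_ack(delivery_tag=method.delivery_tag)  # acknowledge only after processing

channel.basic_consume(queue="async.operations.all", on_message_callback=handle_message)
channel.start_consuming()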


As a result of this implementation we got new Magento asynchronous REST API URLs:

http://MAGENTO_URL/rest/async/V1/products/
http://MAGENTO_URL/rest/default/async/V1/products/

So if you use the “async” prefix for your API requests, all calls will go to the RabbitMQ queue and will then be executed by the consumer one by one.
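For example, creating a product asynchronously differs from a normal synchronous call only in the URL prefix. A minimal sketch using Python's requests library (host, admin token, and product data are placeholders):

import requests

MAGENTO_URL = "http://MAGENTO_URL"  # placeholder host
TOKEN = "YOUR_ADMIN_TOKEN"          # placeholder admin access token

payload = {
    "product": {
        "sku": "example-sku",
        "name": "Example Product",
        "attribute_set_id": 4,
        "price": 9.99,
        "type_id": "simple",
    }
}

response = requests.post(
    MAGENTO_URL + "/rest/async/V1/products",
    json=payload,
    headers={"Authorization": "Bearer " + TOKEN},
)
# The request returns immediately with a bulk_uuid instead of the created product.
print(response.json())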

The output result will then look like this:

  "bulk_uuid": "b8b79af4-fe6a-4f8a-a6f3-76b6e95aeec8",
  "request_items": {
    "items": [
      {
        "id": 0,
        "data_hash": null,
        "status": "accepted"
      }
    ]
  }
}


Where "bulk_uuid" can be used as a ID for tracking the operation status.

The following command starts the consumer that processes the asynchronous queue:
bin/magento queue:consumers:start async.operations.all
 

The whole integration looks like this:

Magento - Asynchronous API

Status Endpoints

In addition to asynchronous operations we also implemented new status endpoints:

Status Short

Returns the status of the operation:
GET /V1/bulk/status-short/:UUID 

with response:

{
  "operations_list": [
    {
      "id": 0,
      "status": 0,
      "result_message": "string",
      "error_code": 0
    }
  ],
  "operations_counter": {
    "operations_total": 0,
    "open": 0,
    "operations_successful": 0,
    "total_failed": 0,
    "failed_not_retriable": 0,
    "failed_retriable": 0,
    "rejected": 0
  },
  "extension_attributes": {},
  "bulk_id": "string",
  "description": "string",
  "start_time": "string",
  "user_id": 0,
  "operation_count": 0
}
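A client can poll this endpoint with the bulk_uuid returned by the asynchronous call, for example as in the following hedged sketch (host, token, and store-scope prefix are placeholders):

import requests

MAGENTO_URL = "http://MAGENTO_URL"  # placeholder host
TOKEN = "YOUR_ADMIN_TOKEN"          # placeholder admin access token
bulk_uuid = "b8b79af4-fe6a-4f8a-a6f3-76b6e95aeec8"  # taken from the async response

status = requests.get(
    MAGENTO_URL + "/rest/V1/bulk/status-short/" + bulk_uuid,
    headers={"Authorization": "Bearer " + TOKEN},
).json()

print(status["operations_counter"])  # e.g. how many operations are still open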


Status Detailed

Returns information about the operation status as well as a detailed result for each operation:
GET /V1/bulk/status-detailed/:UUID

with response:

{
  "operations_list": [

    {
      "id": 0,
      "topic_name": "string",
      "status": 0,
      "result_serialized_data": "string",
      "result_message": "string",
      "error_code": 0
    }
  ],
  "operations_counter": {
    "operations_total": 0,
    "open": 0,
    "operations_successful": 0,
    "total_failed": 0,
    "failed_not_retriable": 0,
    "failed_retriable": 0,
    "rejected": 0
  },
  "extension_attributes": {},
  "bulk_id": "string",
  "description": "string",
  "start_time": "string",
  "user_id": 0,
  "operation_count": 0
}

The Results

After three months of work in cooperation with the Magento team, we were able to release the new feature for the Magento Open Source system and took second place in the Magento Contributors ranking for Q1 2018 (https://magento.com/magento-contributors#partners).

For us as developers at comwrap, it was a truly great experience to work together as one big team, to participate in the Magento Contribution Days to improve our personal and team skills, and to develop this amazing new feature that we can use for our clients in the future.

 

Do you want to become a part of the Magento Contributors Team and meet new people who have the same passion as you?

Then join us on the Magento Contribution Day 
in Frankfurt/Main on the 12th of May 2018 at comwrap 

Register now