Thursday, June 22, 2017

MuleSoft: mocking database endpoints in an MUnit test

1.0 Overview

We will create a sample Mule application that accepts an HTTP request and queries all rows from a table in a database. I will use a Derby in-memory DB for this demo. The next section covers how to set up the database; please click on the following link to read more about Apache Derby.
Most online MuleSoft tutorials teach you how to run Derby in embedded mode. Here I will show you how to run it in server mode, which means you do not need to set up any Spring beans or write any custom database-initialization Java code; none of that jiggery-pokery (as any Kiwi would say, aye).
The other benefit of running Derby in server mode is that you can keep reusing the database for any lab work you plan on doing, and I can keep reusing it to teach you new things.
The other reason I am writing this article is that I posted a question to the forums asking whether database endpoints could be mocked, and the answers I got ranged from "nope" to "maybe" to "I don't know". The answer is a resounding yes: you can absolutely mock a database endpoint's response, and if you follow me through this article I will show you how.
Section 4.2 shows how this is done directly, but I urge you to go through the whole article so you can follow the story; it builds up quite nicely, bit by bit, to Section 4.2. So let the adventure begin.

2.0 Creating the Derby Demo DB

I am setting up the Derby in-memory database in a Windows environment; for more details you can refer to the following link.

  1. Execute the following command in a command prompt:
java -jar %DERBY_HOME%\lib\derbyrun.jar ij
  2. Then execute the following command:
CONNECT 'jdbc:derby:DemoDb;create=true;user=me;password=mine';
Notice that after you have executed the command you will see a DemoDb folder reflecting your execution.
  3. Next you need to execute the following SQL file; you can get the SQL file from GitHub. You can click on the link, copy the SQL script, paste it into the command prompt, and press Enter.
  4. To test that everything was successful, execute the following select statement:
select * from employees;
If your tables were created successfully you will see the result set printed in your command prompt.
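The actual DDL lives in the GitHub script, but to give you an idea of what you are executing, it amounts to something along these lines (a hypothetical sketch only; the real column names and types in the script may differ):

```sql
-- Hypothetical sketch of the demo schema; the real GitHub script may differ.
CREATE TABLE EMPLOYEES (
    ID   INT NOT NULL PRIMARY KEY,
    NAME VARCHAR(100),
    ROLE VARCHAR(100)
);

-- Each HTTP request to the Mule app will log a row in here.
CREATE TABLE CALL_LOG (
    ID                 INT NOT NULL GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    EXECUTIONTIMESTAMP TIMESTAMP
);

INSERT INTO EMPLOYEES (ID, NAME, ROLE) VALUES (1, 'Jane Doe', 'Engineer');
```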

2.1 Starting the Derby Demo DB

  1. Before you start the DB, in the same command prompt you need to set the derby.system.home variable. Execute the following command:
set derby.system.home="C:\apache\db-derby-\DemoDb"
  2. To start the DB server, open a new command prompt and execute the following command:
startNetworkServer -h -p 1527
The console output will show that the server has started.

3.0 Creating the Mule Application

The Mule application that we will be creating is pretty straightforward: it connects directly to the Derby database and queries the employee table.
It is a simple application, depicted in Figure 3.0a.
Figure 3.0a
The solution files will contain both the implementation and the global.xml Mule configuration files, as depicted in the following diagram.
Figure 3.0b
I separated the global configuration from the actual Mule flow implementation so that I can show you how to conduct integration testing on a flow that has dependencies across multiple Mule configuration files.
The global.xml Mule configuration file has no Mule flows in it; it is populated only with global configurations. The following illustration (Figure 3.0c) depicts only two global configuration items.
Figure 3.0d
The generic database configuration item looks like the following; nothing is configured in any of the other tabs, only the General tab.
Figure 3.0e
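For reference, those two global elements boil down to Mule 3 configuration XML along these lines (a sketch only; the names, host, and port values are assumptions, and the schemaLocation declarations are omitted):

```xml
<!-- Sketch only: element names follow the Mule 3 HTTP and Database connectors;
     the name/host/port values here are assumptions. -->
<http:listener-config name="HTTP_Listener_Configuration"
                      host="0.0.0.0" port="8081"
                      doc:name="HTTP Listener Configuration"/>

<db:generic-config name="Generic_Database_Configuration"
                   url="jdbc:derby://localhost:1527/DemoDb"
                   driverClassName="org.apache.derby.jdbc.ClientDriver"
                   doc:name="Generic Database Configuration"/>
```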
At the bottom of the configuration screen there is a "Test Connection" button. Click on it to confirm that the database has started; if it has, you will see the following dialog (Figure 3.0f).
Figure 3.0f

3.1 Sending HTTP request to the Mule Application

This is a simple Mule application: every time there is an HTTP request, it logs/inserts a record into the CALL_LOG table, and then proceeds to select all records from the employee table in the Derby database.
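In Mule 3 XML the flow amounts to something like the following (a sketch, not the exact solution file; the listener path, query text, and doc:name values are assumptions):

```xml
<!-- Sketch of the flow described above; attribute values are assumptions. -->
<flow name="munitdbmockingFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/employees" doc:name="HTTP"/>
    <!-- log/insert a record into CALL_LOG -->
    <db:insert config-ref="Generic_Database_Configuration" doc:name="Database">
        <db:parameterized-query><![CDATA[INSERT INTO CALL_LOG (EXECUTIONTIMESTAMP) VALUES (CURRENT_TIMESTAMP)]]></db:parameterized-query>
    </db:insert>
    <!-- then select all records from EMPLOYEES -->
    <db:select config-ref="Generic_Database_Configuration" doc:name="Database">
        <db:parameterized-query><![CDATA[SELECT * FROM EMPLOYEES]]></db:parameterized-query>
    </db:select>
    <json:object-to-json-transformer doc:name="Object to JSON"/>
</flow>
```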
Let's fire up the Mule application and I will show you what I mean.
Open up Postman and issue a GET request to the following URL.
You do not need to key in any query parameters; your Postman screen will look like the following (Figure 3.1a).
Figure 3.1a
Once you have clicked Send you will get back a JSON payload that is full of employee data.
Figure 3.1b
On your Anypoint Studio Mule console you will see the following being logged.
Figure 3.1c
Notice the "executionTimeStamp" log. Now open another Windows command prompt and execute the following command.
java -jar %DERBY_HOME%\lib\derbyrun.jar ij
This will bring up the ij console for a Derby database connection.
At the ij console, type the following command and press Enter:
connect 'jdbc:derby://localhost:1527/DemoDb';
You will now be connected to the Derby database. Once in, I want you to execute the following select statement.
Select * from call_log;
You will see the selected result set (Figure 3.1d).
Figure 3.1d
Notice that the "executionTimeStamp" is the same as what was printed in the Anypoint console log.

4.0 Creating MUNIT test

Now that we have a clear understanding of what the Mule application does, we will proceed to create an MUnit test for it. As depicted in Figure 4.0a, right-click on the flow, go to MUnit in the menu, and select "Create new munitdbmocking.xml suite".
Figure 4.0a
Immediately following that click you will see an MUnit flow created for you.
Figure 4.0b
If you go to the Global Elements tab (Figure 4.0c), you will see that MUnit has imported only one Mule configuration.
Figure 4.0c
We must now create another import to also include global.xml. To do this, click on the Create button, select Bean, and under it select Import (Figure 4.0d).
Figure 4.0d
Type "classpath:global.xml" into the text box (Figure 4.0e).
Figure 4.0e
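Under the covers, this adds a Spring import to the MUnit suite file; the resulting XML looks something like this (a sketch; the generated suite file already imports the flow's own configuration):

```xml
<!-- Sketch of the imports inside the MUnit suite XML. -->
<spring:beans>
    <spring:import resource="classpath:munitdbmocking.xml"/>
    <spring:import resource="classpath:global.xml"/>
</spring:beans>
```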
Now that we have all the dependencies sorted, we need to construct the HTTP initialization message, as if we were executing the Mule application from Postman (Figure 4.0f).
Figure 4.0f
Next I add an assert payload. I have created a JSON payload file in test resources named "expectedPayload.json" and referred to it in the "Assert Payload" MUnit message processor. I do this so that I do not clutter the MUnit configuration file with unnecessary hard-coded payload values.
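Putting the pieces together, the MUnit test ends up as XML roughly like the following (a sketch using MUnit 1.x elements; the test name, set-message payload, and flow name are assumptions):

```xml
<!-- Sketch of the MUnit test; names and the set payload are assumptions. -->
<munit:test name="munitdbmockingFlowTest" description="Test munitdbmockingFlow">
    <!-- construct the inbound message, as if it came from Postman -->
    <munit:set payload="#['']" doc:name="Set Message"/>
    <flow-ref name="munitdbmockingFlow" doc:name="Flow-ref to munitdbmockingFlow"/>
    <!-- compare against the JSON file kept in test resources -->
    <munit:assert-payload-equals
        expectedValue="#[getResource('expectedPayload.json').asString()]"
        doc:name="Assert Payload"/>
</munit:test>
```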

4.1 Running the MUNIT test

Next we are going to run the MUnit test, but before we do I want you to delete all records from the CALL_LOG table.
Right-click on a blank area of the MUnit configuration window and select Run As, as in Figure 4.1a.
Figure 4.1a
After a successful MUnit test run you will have a test window that shows green; this means all your tests executed successfully (Figure 4.1b).
Figure 4.1b
Now go back to the command prompt console and try to select records from the CALL_LOG table.
You will notice that the MUnit test has executed an integration test for you and has written to the CALL_LOG table. This is inconsequential if you are doing a lab exercise, but imagine if it happened in production, where you have accidentally executed MUnit tests against production configuration. Imagine the horror of finding all your production systems populated with MUnit test data; it's utterly horrible, and the mere possibility of it happening gives me the heebie-jeebies.

4.2 Mocking Database Endpoints

This is why we must mock certain endpoints; to be clear, we are not here to mock the database itself, but its endpoints. Let's mock the endpoint where we write to the database. Drag a Mock message processor onto the MUnit test case (Figure 4.2a) and select the highlighted message processor.
Figure 4.2a
When we tested the Mule application in section 3.1, we could see from the console log that the insert DB operation produces a payload of "1", so let's configure that into the mock and click Save. Upon completion your mock configuration will look similar to the following depiction (Figure 4.2b).
Figure 4.2b
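For reference, the resulting mock boils down to MUnit XML along these lines (a sketch; the messageProcessor matcher value is an assumption based on the flow shown earlier):

```xml
<!-- Sketch of the mock; the matcher value is an assumption. -->
<mock:when messageProcessor="db:insert" doc:name="Mock">
    <!-- return the payload a real insert would produce -->
    <mock:then-return payload="#[1]"/>
</mock:when>
```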
With the mock configured, let's now rerun the MUnit test; the Anypoint console will display the same format of log messages as in our previous MUnit test run (Figure 4.2c).
Figure 4.2c
Now if you go back to the ij Derby console and run a select against the CALL_LOG table, you will see that no new records have been created (Figure 4.2d). We have successfully mocked the database insert message processor, sparing MUnit from executing the real insert DB operation in the original flow implementation.
Figure 4.2d

5.0 Conclusion

The Mock message processor in MUnit can be used to mock any message processor; you just need to import all the dependent Mule configuration XML files. I had issues with this when I first started: I couldn't find the message processor I wanted to mock, and the root cause was that I had not imported the dependent Mule XML configuration. I am sure this mistake trips up a lot of Mule developers; if you have read this article in its entirety you will be spared it.
The full source code, including the MUnit test case, can be obtained from the following link:


Friday, June 16, 2017

MuleSoft Integration Testing with Postman and Newman CLI

1.0 Overview

There are two types of testing when it comes to integration software development, namely unit testing and integration testing.

Unit testing in Mule can be realized with JUnit and MUnit. You can actually use MUnit for integration testing to a certain degree, and integration testing can also be done using SOAP UI and cURL.

In this article, however, we are going to talk about integration testing using Postman and the Newman CLI. I will use the following sections as a step-by-step walkthrough to teach you how this can be done.

2.0 Creating the Mule Application

The Mule application we are going to create is a simple flow with an HTTP receive endpoint, as shown in Figure 2.0a.

Figure 2.0a
Figure 2.0b shows the content/configuration of the logger message processor.

Figure 2.0b
The following is the full Mule XML configuration for the application (schemaLocation declarations omitted for brevity).
<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:http="http://www.mulesoft.org/schema/mule/http"
      xmlns:doc="http://www.mulesoft.org/schema/mule/documentation">
    <flow name="postmanautomatedtestFlow">
        <http:listener config-ref="HTTP_Listener_Configuration" path="/postman" doc:name="HTTP"/>
        <byte-array-to-string-transformer doc:name="Byte Array to String"/>
        <logger message="Postman Automated Test #[payload + &quot; &quot;] #[message.id]" level="INFO" doc:name="Logger"/>
    </flow>
</mule>
The message id logged at the end is unique and is created for each new request. I have purposely logged it to show you that each time you see a log line in the console, a new external request has been triggered.

Once you have created a similar HTTP receive endpoint flow in Anypoint Studio, proceed to start the application.

3.0 Testing with Postman

I have created a YouTube video on testing a Mule application with Postman; you can watch the video if you want, otherwise just read the following instructions.
1) In order to test the created Mule application, start up Postman and point it to the following URL: http://localhost:8082/postman. Change the method from GET to POST and put the string literal "testing" in the payload text area.

2) The next step is to click Send. You will get a 200 response with the same string literal that you previously entered, because the Mule application does nothing but log your request to the Anypoint Studio console. Notice the status code returned is 200 OK.

3) Next we are going to the Tests tab, at the top of the Postman screen just below the address bar, as shown in the following picture.

4) Once you are in the Tests text area, key in the following assert statement.
tests["Status code is 200"] = responseCode.code === 200;
It will look like the following print screen. There are a lot of assert snippets you can explore on the right side of your screen.
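To illustrate how these legacy-style assertions behave, here is a self-contained sketch. The `tests`, `responseCode`, `responseTime`, and `responseBody` globals are normally provided by the Postman sandbox at runtime; they are stubbed here with assumed values so you can run the snippet on its own in Node.

```javascript
// Stubs for the globals Postman normally provides in its sandbox
// (the values below are assumptions for illustration only).
var tests = {};
var responseCode = { code: 200 };   // assumed status of the response
var responseTime = 150;             // assumed response time, in ms
var responseBody = "testing";       // the echoed payload

// The same legacy-style assertions you would key into the Tests tab.
tests["Status code is 200"] = responseCode.code === 200;
tests["Response time under 1s"] = responseTime < 1000;
tests["Body contains testing"] = responseBody.indexOf("testing") !== -1;

// Postman marks a test as passed when its property is truthy.
console.log(tests);
```

Each entry in `tests` that evaluates to true counts as a passing test in Postman's result pane, which is where the Tests (1/1) counter in the next step comes from.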

5) Now when you click the Send button again, notice that you will see Tests (1/1) at the bottom of the result pane.

6) If you change your asserted response code to 400 instead of 200 and click Send again, you will notice the test now shows Tests (0/1), which means the test has failed. Change the assert back to 200 as stated in step 4).

7) Now save the test into a collection: go to the top right corner of the Postman screen, click on the Save button, and select the Save As menu.

8) A Save Request pop-up dialog will appear; key in the following inputs and proceed to click Save.

9) Once you have saved your test into a collection, notice that you will see the previously saved collection and the test on the far left, in the Collections tab, as depicted in the following illustration.

10) Now I need you to export the saved collection into a JSON file so that we can use the Newman command-line interface (CLI) to trigger it. In order to export the saved test into a JSON file, click on the ellipsis button and select Export from the menu.

11) An Export Collection dialog box will pop up. Select the Collection V2 radio button, click the Export button, and save to a folder that is easily accessible; we will use this file in the following section.

4.0 Newman CLI Testing using Postman collection export

If you don't have Newman installed you need to install it via npm, and if you don't have npm installed you need to install that first. I am using a Windows laptop, so I am going to launch the Windows command prompt to execute the Newman CLI.
Navigate to the folder containing your exported JSON file. If you open the JSON file you will see the following as its contents.
{
    "variables": [],
    "info": {
        "name": "PostmanAutomatedTestDemo",
        "_postman_id": "75d716b9-e83b-ae2c-91dd-d5ae7ccf4327",
        "description": "",
        "schema": ""
    },
    "item": [
        {
            "name": "TestingHTTPMuleFlow",
            "event": [
                {
                    "listen": "test",
                    "script": {
                        "type": "text/javascript",
                        "exec": [
                            "tests[\"Status code is 200\"] = responseCode.code === 200;"
                        ]
                    }
                }
            ],
            "request": {
                "url": "localhost:8082/postman",
                "method": "POST",
                "header": [],
                "body": {
                    "mode": "raw",
                    "raw": "testing "
                },
                "description": ""
            },
            "response": []
        }
    ]
}

Launch a Windows command prompt and execute the following command.
newman run PostmanAutomatedTestDemo.postman_collection.json -n 10

Once Newman has finished running your test, the following result table will be printed in your command prompt.

The "-n 10" option in the Newman CLI means run this test 10 times. You will see the same thing reflected in the Anypoint console, where 10 lines will be logged, each with a unique message id.

5.0 Conclusion

I have shown you how, in a few simple steps, you can set up automated integration testing. One practical thing you can do with this is load testing: if you have an API exposed via an HTTP receive endpoint, you can simulate load by running a few command prompts concurrently, each with a few hundred or even thousands of iterations.