dbCONNECT - Securely connect to SQL databases behind firewalls from tests on the Cloud

As you build test automation for enterprise applications or large eCommerce apps, you frequently need to access data stored in SQL databases within your corporate network, behind firewalls.  You may do this to ‘set up’ test data to get your application to a state that is ready for the test, or to ‘clean up’ after your tests have run. You may also read input data for your tests from a database, validate information presented on the UI against data in a database, or write test results to a database.

Until now, your only option was to write custom code to connect to these databases from your tests.  And if you run your tests on the Cloud, connecting to databases running on servers behind your corporate firewall poses a whole different set of challenges.

This is no longer the case!

dbCONNECT from eureQa allows you to securely connect to SQL databases behind your firewalls from tests running on eureQa – without writing any custom code.  With just a few commands you can connect to databases and perform Create, Read, Update & Delete (CRUD) operations from existing tests within eureQa.
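In eureQa these operations are configured as test steps rather than written as code, but each one maps onto a plain SQL statement. As an illustrative sketch only – using Python's built-in sqlite3 in place of a real firewalled enterprise database, with hypothetical table and column names:

```python
import sqlite3

# An in-memory SQLite database stands in for the firewalled enterprise
# database; the 'customers' table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")

# Create: set up test data before the test runs
cur.execute("INSERT INTO customers (name, status) VALUES (?, ?)", ("Test User", "active"))

# Read: fetch data to drive a test or verify the UI against
row = cur.execute("SELECT name, status FROM customers WHERE id = ?", (1,)).fetchone()

# Update: push the record into the state the test needs
cur.execute("UPDATE customers SET status = ? WHERE id = ?", ("suspended", 1))

# Delete: clean up after the test
cur.execute("DELETE FROM customers WHERE id = ?", (1,))
conn.commit()
```

The same four statement shapes cover the set-up, verification, and clean-up scenarios described above.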

Here are some typical scenarios where you may want to use dbCONNECT:

Verify 'Interim' Data

An insurance quoting application that shows the premium to be paid by a customer may not show the interim calculations used to arrive at that premium (there can be many, depending on the insurance product and the coverage being quoted), or the component values of the total premium that are stored in the database.  Verifying the data from these calculations can be critical to pinpointing errors.

Dynamic Test Data

eCommerce apps consume and render data that changes fast – often multiple times a day.  Ensuring that product availability, pricing, and promotions are accurately reflected in the apps takes more than static test data files. Access to dynamic, real-time data in enterprise databases is critical to getting an independent verification of the information presented to your customers – and can be the difference between greater revenue and lost customers.

Transient 'One-Time-Use' Data

Single-use coupon codes are gaining popularity in eCommerce promotions, especially when you want to track the use of these promotions by specific customers.  Testing these promotions requires your tests to have access to a set of ‘valid’ single-use coupon codes, which can be difficult to generate or simulate within your test.  Looking up these codes from a database ensures that your tests use codes that your application will recognize.
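One way such a lookup can work – sketched here with Python's stdlib sqlite3 and a hypothetical coupons table, not eureQa's actual commands – is to claim an unused code and mark it used in the same step, so two tests running in parallel never pick the same single-use code:

```python
import sqlite3

# Hypothetical table of single-use promotion codes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE coupons (code TEXT PRIMARY KEY, used INTEGER DEFAULT 0)")
cur.executemany("INSERT INTO coupons (code) VALUES (?)",
                [("SAVE10-A1",), ("SAVE10-B2",), ("SAVE10-C3",)])

def claim_coupon(cur):
    """Pick one unused code and immediately mark it used, so a
    concurrent test run cannot claim the same single-use code."""
    row = cur.execute("SELECT code FROM coupons WHERE used = 0 LIMIT 1").fetchone()
    if row is None:
        raise RuntimeError("no unused coupon codes left")
    cur.execute("UPDATE coupons SET used = 1 WHERE code = ?", (row[0],))
    return row[0]

code = claim_coupon(cur)  # the test then applies this code at checkout
```

Against a real database, the select-and-mark pair would run inside one transaction.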

Simulate non-UI workflow steps

Some application workflows involve offline or asynchronous steps that have no associated UI component.  In such cases, end-to-end automation of the workflow from the UI can be a challenge. It may, however, be possible to ‘push’ the workflow along by directly updating the data or the workflow status in the database.  Here, dbCONNECT forms an important link in end-to-end automation of test workflows.
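The ‘push’ usually amounts to a single conditional UPDATE. A minimal sketch, again using sqlite3 and a hypothetical orders table in which a back-office approval step with no UI moves an order from 'submitted' to 'approved':

```python
import sqlite3

# Hypothetical orders table; in the real workflow an offline
# approval process would perform this status transition.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
cur.execute("INSERT INTO orders (status) VALUES ('submitted')")

# 'Push' the workflow along so the UI test can continue end-to-end.
# The status guard ensures we only advance orders in the expected state.
cur.execute(
    "UPDATE orders SET status = 'approved' WHERE id = ? AND status = 'submitted'",
    (1,),
)
status = cur.execute("SELECT status FROM orders WHERE id = 1").fetchone()[0]
```

With the order now 'approved', the UI test can proceed to the steps that depend on that state.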

Trigger external data loads or processing

Some applications rely on receiving data from another data source, or on pushing data to another database (for example, from an OLTP schema to an OLAP schema), as part of their normal operations.  These data exchange processes may be set to run on a predetermined schedule.  The ability to trigger them directly from a test ensures that the test uses consistent data across all sources and eliminates ‘false positives’ arising from inconsistent test data.

Test Data Partitioning with Data Filters

Testers are often faced with the challenge of running tests that evaluate the impact of specific data or configuration changes on their apps.  These tests typically require test data that has been specifically prepared for them.  In most cases this data already exists as part of a larger regression test data set.  Being able to extract it dynamically and use it in tests saves a tremendous amount of time and effort.


The Challenge

Testing changes to Product Catalogs often involves using a subset of your test data in a test run. 

Extracting this subset from your test data, however, can be cumbersome, especially if it has to be done multiple times to test a variety of conditions, such as a product description change, a SKU change, or an out-of-stock condition.

The specific subset of data has to be selected at runtime, based on dynamic data within the test.


The Solution

Create Data Filters to partition the test data sets used to test the Product Catalog.

Filters can be created on one or more attributes (columns) of the data in the Product Catalog. This increases flexibility in selecting the right subset of data for tests.

Filtering can be done using static data (a specific Product SKU, etc.) or dynamic data (Product SKUs that have been marked out-of-stock in a previous test step).

Filter criteria can be passed to the test through the REST API.
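Conceptually, a Data Filter is a parameterized WHERE clause built from the filter criteria. A sketch under assumptions – sqlite3 in place of the real catalog database, a hypothetical `catalog` table, and criteria arriving as a simple dictionary such as a REST API request body might carry:

```python
import sqlite3

# Hypothetical Product Catalog table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE catalog (sku TEXT, category TEXT, in_stock INTEGER)")
cur.executemany("INSERT INTO catalog VALUES (?, ?, ?)",
                [("SKU-1", "shoes", 1), ("SKU-2", "shoes", 0), ("SKU-3", "hats", 1)])

def filter_rows(cur, criteria):
    """Select catalog rows matching all criteria, e.g. as received
    through a REST API call. Column names must come from a trusted,
    whitelisted set in a real implementation; only values are
    passed as bound parameters here."""
    clauses = " AND ".join(f"{col} = ?" for col in criteria)
    sql = f"SELECT sku FROM catalog WHERE {clauses}"
    return [r[0] for r in cur.execute(sql, tuple(criteria.values()))]

# Static filter: a specific attribute value known up front.
shoes = filter_rows(cur, {"category": "shoes"})

# Dynamic filter: e.g. out-of-stock SKUs flagged in a previous test step.
oos = filter_rows(cur, {"category": "shoes", "in_stock": 0})
```

Each filtered result becomes the test data set for one run, so the same regression data can serve the description-change, SKU-change, and out-of-stock scenarios without manual preparation.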

Watch the Video