As information and the technology around it grow exponentially in complexity, Big Data Migration projects are increasing in both frequency and complexity. This ever-growing need for Big Data Migration brings numerous challenges from a testing perspective, including:
- Data Management not designed into Big Data Architecture
- Hadoop Distributed File System (HDFS)-based data is difficult to extract and compare
- Data volume and velocity make testing difficult
- Low tester productivity makes it difficult to achieve fast time-to-market for Big Data Migration
Big Data Implementation:
Any Big Data Migration raises intricate issues, including data quality, governance, policy adherence, and validation. QualiDI addresses these issues and more through its unique, full-service design. QualiDI is a code-free test-automation tool that makes implementing Big Data for data processing a success.
QualiDI can validate pre-Hadoop processing and compare data migrated to Hadoop against the Enterprise Data Warehouse (EDW). Based on the business rules, QualiDI generates test cases, including queries that read HDFS to validate the target. Comparison results are stored in MongoDB and displayed on dashboards and reports.
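The source-to-target comparison described above can be sketched in outline. The following is a minimal, illustrative Python sketch of row-level reconciliation between two extracts; it is not QualiDI's internal implementation, and the `edw` and `hdfs` sample rows are hypothetical placeholders for data read from the warehouse and from HDFS.

```python
import hashlib

def row_fingerprint(row):
    """Hash a normalized row so records can be compared across systems."""
    canonical = "|".join(str(v).strip().lower() for v in row)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows):
    """Return fingerprints missing from the target and unexpected in the target."""
    source = {row_fingerprint(r) for r in source_rows}
    target = {row_fingerprint(r) for r in target_rows}
    return source - target, target - source

# Hypothetical extracts: one from the EDW, one read back from HDFS.
edw = [(1, "Alice", "NY"), (2, "Bob", "CA")]
hdfs = [(1, "Alice", "NY"), (3, "Eve", "TX")]

missing, unexpected = reconcile(edw, hdfs)
# One EDW row failed to migrate; one extra row appeared in HDFS.
```

In a real migration test the fingerprints (or full mismatch details) would be written to a results store such as MongoDB for dashboard reporting, rather than compared in memory.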
In addition to simplifying and streamlining the process around Big Data Migration, QualiDI also specifically allows testing teams to:
- Compare Hadoop data and any other heterogeneous data sources for test completeness
- Test voluminous data with minimal testing duration
- Utilize a centralized repository of requirements, test cases, and test results
- Interface with central Quality Assurance (QA) tools (e.g., HP Quality Center)
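Completeness checks across heterogeneous sources, as in the first bullet above, often start with cheap aggregate signals before any row-level diffing. This is a minimal, hedged sketch of that idea, not QualiDI's actual mechanism; the sample rows stand in for extracts from Hadoop and a relational source.

```python
def completeness_summary(rows, numeric_col):
    """Row count plus a column sum: cheap signals that two extracts match.

    Matching summaries do not prove equality, but a mismatch proves
    the migration is incomplete and row-level comparison is needed.
    """
    return (len(rows), sum(r[numeric_col] for r in rows))

# Hypothetical extracts from two heterogeneous sources.
hadoop_rows = [(1, 100.0), (2, 250.5)]
oracle_rows = [(1, 100.0), (2, 250.5)]

hadoop_summary = completeness_summary(hadoop_rows, 1)
oracle_summary = completeness_summary(oracle_rows, 1)
```

If the summaries disagree, a finer-grained comparison (e.g., per-row hashing) pinpoints the missing or extra records.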