Big Data Optimization Using Hadoop

ILW is currently performing a Hadoop installation and a data architecture/engineering setup to deliver data warehouse optimization and enable data science and visualization.

ILW, in partnership with Hortonworks, is installing and configuring a Hadoop environment for an upscale American retailer. The team is establishing a data management framework (ingestion, ETL), data governance policies, and security controls to transition legacy data (sales, inventory, product, and store) from diverse data stores into Hadoop.
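
As a rough illustration of the ingestion and ETL pattern described above (not the project's actual code), the PySpark sketch below pulls a legacy sales table out of a relational store over JDBC, applies light cleanup, and lands it in a partitioned Hive table. The connection string, credentials, table names, and columns are hypothetical placeholders; Sqoop, listed among the project's tools below, performs the same kind of relational-to-Hadoop transfer from the command line.

```python
from pyspark.sql import SparkSession

# Hypothetical ingestion job: names, URLs, and credentials are placeholders.
spark = (
    SparkSession.builder
    .appName("legacy-sales-ingest")
    .enableHiveSupport()          # allow writing to Hive-managed tables
    .getOrCreate()
)

# Pull a legacy sales table over JDBC (the source system is an assumption).
sales = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//legacy-db:1521/SALESDB")
    .option("dbtable", "SALES.TRANSACTIONS")
    .option("user", "etl_user")
    .option("password", "change_me")
    .option("fetchsize", "10000")
    .load()
)

# Light ETL: lower-case the column names and drop rows missing a key.
cleaned = (
    sales.toDF(*[c.lower() for c in sales.columns])
    .dropna(subset=["transaction_id"])
)

# Land the data in a partitioned Hive table for downstream data science work
# (assumes a store_id column exists in the source data).
(
    cleaned.write
    .mode("append")
    .partitionBy("store_id")
    .saveAsTable("retail.sales_transactions")
)
```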

Key Aspects of Project


Hadoop installation and configuration

Hands-on big data, data science, and visualization expertise

Supply chain optimization, including maximizing efficiencies in supply distribution channels


Data management best practices

Quick Time to Optimization

  • Stood up the first on-premises Hadoop cluster
  • After cluster handoff, only 30 days elapsed from the start of ingestion to the commencement of data science work
  • The ingested logistics data is in use today by the data science team to make meaningful logistics recommendations for the business


Consultants used a range of Hadoop ecosystem technologies, including Sqoop, Spark, Oozie, Ambari, Hive, and YARN.
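
As a rough, hypothetical illustration of how two of these tools (Spark and Hive) come together for the logistics analysis mentioned above, the PySpark sketch below computes average replenishment lead times per distribution lane from an ingested Hive table. The retail.shipments table, its columns, and the job name are placeholder assumptions, not the project's actual schema or code.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("logistics-lead-time")   # hypothetical analysis job
    .enableHiveSupport()
    .getOrCreate()
)

# retail.shipments is a hypothetical Hive table of ingested logistics records.
shipments = spark.table("retail.shipments")

# Average lead time (days) per distribution-center -> store lane.
lead_times = (
    shipments
    .withColumn(
        "lead_time_days",
        F.datediff(F.col("delivered_date"), F.col("shipped_date")),
    )
    .groupBy("distribution_center", "store_id")
    .agg(
        F.avg("lead_time_days").alias("avg_lead_time_days"),
        F.count("*").alias("shipments"),
    )
    .orderBy(F.desc("avg_lead_time_days"))
)

# The slowest lanes surface as candidates for rerouting or carrier changes.
lead_times.show(20, truncate=False)
```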

Interested In Working With Us?