
Hadoop has rapidly emerged as a viable platform for Big Data analytics, and many experts believe it will subsume much of the data warehousing work presently done by traditional relational systems. In this session, you will learn about the similarities and differences between Hadoop and parallel data warehouses, along with typical best practices. Edmunds will discuss how it increased delivery speed, reduced risk, and achieved faster reporting by combining ELT and ETL: for example, Edmunds ingests raw data into Hadoop and HBase, then reprocesses that raw data in Netezza. You will also learn how Edmunds uses prototyping to let the company's Analytics Team work on nearly raw data in Netezza.

Greg Rokita, Director, Software Architecture, Edmunds

Gregory Rokita is a Director of Software Architecture at Edmunds, where he designs the company's core frameworks: content and digital asset management, messaging infrastructure, search APIs, and Big Data analytics. His interests include developing semi-structured data representations that unify search and application logic, distributed systems, and domain-specific data stores. Greg holds an M.S. in Computer Science from Stanford University, where he researched large-scale programming paradigms, search, and databases.

Krishnan Parasuraman, CTO, Digital Media, Netezza

Krishnan Parasuraman works closely with Netezza's Digital Media customers in an advisory capacity and is an authority on using big data technologies, such as Hadoop and massively parallel data warehousing, to solve analytical problems in online digital advertising, customer intelligence, and real-time marketing. Krishnan has worked in R&D, consulting, and technology marketing roles within information management, and has delivered data warehousing solutions for large media and consumer electronics organizations such as Apple, Microsoft, and Kodak.
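The ELT pattern described above, land the raw data first, then transform it inside the warehouse with SQL, can be sketched in miniature. This is a hypothetical illustration, not Edmunds' actual pipeline: it uses an in-memory SQLite database as a stand-in for Netezza, and a plain Python list as a stand-in for raw records ingested into Hadoop/HBase; the table and column names are invented for the example.

```python
import sqlite3

# Stand-in for raw records landed in Hadoop/HBase: untyped, nearly raw data.
raw_events = [
    {"vehicle": "sedan", "price": "21500"},
    {"vehicle": "sedan", "price": "23900"},
    {"vehicle": "truck", "price": "31000"},
]

# Stand-in for the parallel warehouse (Netezza in the session's example).
db = sqlite3.connect(":memory:")

# "L" step: load the raw data as-is, with no upfront transformation.
db.execute("CREATE TABLE raw_events (vehicle TEXT, price TEXT)")
db.executemany("INSERT INTO raw_events VALUES (:vehicle, :price)", raw_events)

# "T" step: transform after loading, casting and aggregating in SQL so the
# heavy lifting happens inside the warehouse engine.
db.execute("""
    CREATE TABLE avg_price AS
    SELECT vehicle, AVG(CAST(price AS REAL)) AS avg_price
    FROM raw_events
    GROUP BY vehicle
""")

for row in db.execute("SELECT vehicle, avg_price FROM avg_price ORDER BY vehicle"):
    print(row)
```

Keeping the raw data intact before transforming it is what lets analysts prototype against "nearly raw" records, as the abstract describes, since no information has been discarded by an upfront ETL step.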