Bloomberg runs on data. As the Enterprise Data Warehouse and Business Intelligence team, we are responsible for driving analytics throughout the organization to improve our products, engage better with our customers, create greater efficiencies, and drive new businesses by providing insights into the data. Can you change the way our business works?
Our data captures the who, what, when, where and why of how our clients use Bloomberg products, how our systems execute, and how our employees engage with our customers. We are responsible for ingesting and preparing massive amounts of data for reporting, dashboards, self-service and advanced analytics. Are you motivated to build robust and scalable systems to handle petabytes of data, with billions of new data points being ingested daily?
Are you a hands-on Senior ETL architect and developer who is excited to tackle the challenges that come with working on large volumes of diverse data with complex business rules? A key objective of this role is to help build an enterprise-level data analytics platform using industry-standard concepts across our Data Warehouse, Data Lake, and Big Data environments. Are you excited to innovate in an environment that combines traditional warehouse technologies, MPP databases, and Hadoop?
In order to be successful:
- You should have a working knowledge of industry-standard data infrastructure tools (e.g., warehouse, BI, analytics, big data) with the goal of providing end users with analytics at the speed of thought.
- You should be proficient at developing, architecting, standardizing, and supporting technology platforms using industry-leading ETL solutions.
- You should thrive in building scalable, high throughput systems that process more than 10 billion data points a day.
- You must have strong communication, presentation, problem-solving, and troubleshooting skills.
- You should be highly motivated to drive innovations company-wide.
You should have:
- 5+ years of experience with ETL tools. Specific expertise implementing Informatica in an enterprise environment is a plus.
- Advanced SQL capabilities are required. Knowledge of database design techniques and experience working with extremely large data volumes are a plus.
- Strong understanding of data warehousing methodologies, ETL processing, and dimensional data modeling.
- Demonstrated experience and ability to work with business users to gather requirements and manage scope.
- Experience programming in a Linux/UNIX environment including shell scripting.
- Programming experience in Python.
We'd love to see:
- Experience with large database and data warehouse implementations (20+ TB)
- Understanding of VLDB performance aspects, such as table partitioning, sharding, table distribution and optimization techniques
- Experience working in a big data environment with technologies such as Hadoop, Spark, and Hive
- Programming experience in Java, C++, or ksh
- Knowledge of reporting tools such as QlikSense, Tableau, Cognos
If this sounds like you, then we want to talk! Please apply here.