Description & Requirements
Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock – from around the world. In the Data department, we are responsible for delivering this data, news, and analytics through innovative technology — quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies and implement technology solutions to enhance our systems, products, and processes — all while providing platinum customer support to our clients.
- Design and deliver training on applied experimentation and causal reasoning that enables teams to evaluate process changes (such as adopting new data pipelines, switching validation methods, or implementing AI-assisted workflows) and quantify their impact on dataset quality and business outcomes.
- Build a curriculum on experimental design, A/B testing, and hypothesis testing for data operations, and teach teams to run controlled experiments that quantify the impact of workflow changes (a minimal sketch of such an exercise follows this list).
- Design and deliver analytics and statistics training that strengthens quantitative reasoning, data quality assessment (accuracy, completeness, reliability), and AI-enhanced insight generation.
- Create hands-on labs where teams design experiments on real Bloomberg datasets—testing pipeline changes, evaluating new tools, and measuring quality improvements using statistical methods.
- Explain core statistical concepts (sampling, correlation, causation, p-values) in the context of data quality and process optimization.
- Incorporate AI-assisted tools (e.g., GitHub Copilot, ChatGPT, NotebookLM) into training design and delivery.
- Ensure teams maintain the highest standards for data quality, observability, and governance while adopting transformative AI technologies.
- Create structured guides, self-service materials, and reusable frameworks (experiment templates, statistical calculators, decision tools) that enable teams to independently design experiments, adopt new tools, and scale impact across the organization (a hypothetical template sketch also follows this list).
- Partner with engineers and domain experts to ensure we’re meeting client needs and leveraging the best technology solutions.
- Stay current with emerging experimentation methods, AI tools, and financial market dynamics, continuously refining curricula to meet the Data organization's evolving needs and business priorities.
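
To make the experimentation bullets concrete, here is a minimal sketch of the kind of exercise such training might walk through: a two-proportion z-test comparing a dataset's error rate before and after a hypothetical pipeline change. All counts are invented placeholders, not Bloomberg data.

```python
# Minimal sketch: did a pipeline change move the error rate?
# The counts below are illustrative placeholders, not real data.
from statsmodels.stats.proportion import proportions_ztest

errors = [130, 98]          # errors found: old pipeline vs. new pipeline
samples = [10_000, 10_000]  # records sampled in each arm

stat, p_value = proportions_ztest(count=errors, nobs=samples,
                                  alternative='two-sided')
print(f"z = {stat:.2f}, p = {p_value:.4f}")
# A small p-value says the gap is unlikely under equal error rates;
# as the training would stress, it does not by itself establish causation.
```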
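The "experiment templates" mentioned above could be as lightweight as a structured plan teams fill in before running a test. The sketch below is hypothetical; the field names are assumptions for illustration, not an existing internal schema.

```python
# Hypothetical experiment template: a structured plan filled in before a test.
from dataclasses import dataclass, field

@dataclass
class ExperimentPlan:
    hypothesis: str                   # e.g. "new validation cuts error rate"
    primary_metric: str               # e.g. "record-level error rate"
    minimum_detectable_effect: float  # smallest change worth detecting
    significance_level: float = 0.05
    power: float = 0.80
    guardrail_metrics: list[str] = field(default_factory=list)

plan = ExperimentPlan(
    hypothesis="AI-assisted validation reduces the error rate",
    primary_metric="error_rate",
    minimum_detectable_effect=0.002,
    guardrail_metrics=["pipeline latency", "coverage"],
)
print(plan)
```
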
- Commitment to cultivating a continuous learning culture across technical teams.
- 3+ years of experience in data analytics or statistics, with hands-on experience designing and analyzing experiments (A/B tests, causal inference studies, process optimization trials) in data-centric environments.
- Bachelor’s degree or higher in Computer Science, Engineering, Data Science, or other data-related field.
- Strong foundation in experimental design and statistical inference: hypothesis testing, confidence intervals, power analysis, p-values, correlation vs. causation, and when each method applies (a power-analysis sketch follows this list).
- Proficiency with statistical analysis in Python or R, including experimentation libraries (scipy, statsmodels, scikit-learn) and data manipulation tools (Pandas, SQL).
- Experience mentoring or teaching technical material, with a passion for continuous learning and knowledge sharing. Ability to translate technical concepts into clear learning content and documentation.
- Strong communication and teaching abilities: a proven track record of explaining complex quantitative concepts to both technical and non-technical audiences through clear examples and hands-on exercises.
- Ability to identify learning needs through stakeholder consultation and translate them into scalable, practical training solutions.
- Understanding of data quality metrics (accuracy, completeness, timeliness) and concepts (data observability, governance), and how to assess them through statistical methods.
- Proven problem-solving skills and adaptability in evolving, fast-paced environments.
- Collaborative approach to partnering across global teams and aligning with business priorities.
- Effective project management skills to develop and manage a roadmap and deliver milestones on time.
- Interest in financial market datasets and their application to data solutions.
- Familiarity with modern data tools and frameworks (e.g., Airflow, Dagster, dbt, Spark, cloud data platforms).
- Active engagement with professional or academic communities in data science, analytics education, or applied experimentation.
- Certification such as DAMA CDMP, EDM Council DCAM, or similar.
- Examples of technical content you've created—whether documentation, tutorials, presentations, or internal training materials.
- Hands-on experience with financial data, market data, or other business-critical datasets.
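
As an illustration of the power-analysis expectation above, a candidate might estimate how many records per arm are needed to detect a drop in error rate from 1.3% to 1.0% (made-up rates) at conventional thresholds:

```python
# Sketch: sample size needed to detect an error-rate drop (illustrative rates).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.013, 0.010)  # Cohen's h for the two rates
n_per_arm = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                         power=0.80, alternative='two-sided')
print(f"~{round(n_per_arm):,} records per arm")
```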