- Ability to define problems, collect data, establish facts, draw valid conclusions, and make recommendations for continuous improvement.
- Communicates data findings to business and IT leaders, using strong business acumen to influence how the organization approaches and meets the challenges of an evolving customer base and a changing marketplace.
- Finds and recommends new uses for existing data sources; designs, modifies, and builds new data processes; and builds large, complex data sets.
- Conducts statistical modeling and experiment design, and tests and validates predictive models.
- Builds web prototypes and performs data visualization.
- Conducts scalable data research on and off the cloud.
- Develops customized algorithms to solve analytical problems with incomplete data sets, and implements automated processes for producing models efficiently at scale.
- Collaborates with database engineers and other scientists to develop, refine, and scale data management and analytics procedures, systems, workflows, and best practices.
- Implements new or enhanced software designed to access and handle data more efficiently.
- Trains the data management team on new or updated procedures.
- Writes and implements quality procedures.
- Experience using one or more programming languages (e.g., Python, Java, C++, C#, Ruby).
- Experience with big data: processing, filtering, and presenting large quantities of data (100K to millions of rows).
- 3+ years of quantitative experience in logistics/supply chain, transportation, engineering, or related businesses.
- 3+ years' experience with statistical tools (e.g., R) and analysis, including regression modeling, forecasting, and time series analysis. Able to write SQL scripts for analysis and reporting (SQL, MySQL).
- Bachelor's degree in Mathematics, Statistics, or Computer Science required; an advanced degree (Master's or Ph.D.) is highly preferred.