Responsibilities include, but are not limited to:
Assembling large, complex data sets that meet functional and non-functional business requirements.
Identifying, designing, and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
Building the infrastructure required for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies.
Building analytical tools that utilize the data pipeline and provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
Working with stakeholders, including the data, design, product, and executive teams, and assisting them with data-related technical issues.
Establishing advanced analysis and data-visualization methodologies, models, and tools to derive and predict intelligence outcomes and impacts.
Implementing data validation and cleaning techniques to find and fix or remove inaccurate and irrelevant data.
Developing and maintaining a data warehouse populated with data from numerous sources in varied formats.
Building, testing, and maintaining data pipeline architectures.
Identifying ways to improve data reliability, efficiency, and quality.
Measuring and optimizing data pipeline and warehouse performance.
Preparing data for predictive and prescriptive modeling in close partnership with the in-house data science team.
Delivering analytics-based updates to stakeholders.
Ensuring compliance with the customer’s data governance and security policies.
Designing, developing, and maintaining scalable, automated, user-friendly systems that support the needs of the business.
Working hands-on with ETL to build data pipelines that support automated reporting.
Recognizing and adopting best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
Required Qualifications:
2+ years of relevant work experience in data science or data engineering in big data environments.
2+ years of experience in data mining and data-set preparation using SQL.
2+ years of experience using data visualization software such as Tableau Desktop or Power BI.
Experience with data modeling, data warehousing, and building ETL pipelines.
Active/current TS/SCI clearance with polygraph.
Desired Qualifications:
Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related technical field of study.
3+ years of experience in a data engineer or developer role with a technology company.
Experience working in large data warehouse environments.
Experience conducting large-scale, complex data analysis to support data-centric architecture.
Experience using Python, R, Java, JavaScript, AngularJS, .NET, Hadoop, Apache, or related technologies and tools.
Strong verbal and written communication and data-presentation skills, including the ability to communicate effectively with both business and technical teams, and with senior management as required.
Bridge Core is proud to be an equal opportunity workplace and affirmative action employer. We celebrate diversity and are committed to creating an inclusive environment for all team members and applicants. At Bridge Core, we ensure fair treatment for our team members and applicants based on their abilities, achievements and experience without regard to race, national origin, sex, age, disability, veteran status, sexual orientation, gender identity or any other classification protected by law.