Data Science and Analytics (DSA)

Data Engineer

Pune, Maharashtra   |   Full Time
About vConstruct:
vConstruct is a wholly owned subsidiary of DPR Construction and specializes in providing high-quality Building Information Modeling and Construction Technology services for construction projects. vConstruct presently comprises five business units: 1. Virtual Design and Construction, 2. Project Controls Management, 3. Software Development, 4. Accounting Services for Projects, and 5. Data Science and Analytics.
Core Values of vConstruct are:
  • Integrity is Integral
  • People are Pivotal
  • Ever Forward Spirit
 

Key Responsibilities:


  • Interface with data scientists, product managers, and business stakeholders to understand data needs and help build data infrastructure that scales across the company
  • Build data pipelines; design, implement, and maintain efficient ETL/ELT processes
  • Work with a variety of data sources - extracting knowledge and actionable information from massive datasets
  • Implement enterprise integrations that result in scalable, flexible, and highly available solutions
  • Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
  • Create data models and speak to the tradeoffs of different modeling approaches
  • Perform extensive data validation/quality assurance analysis within large datasets
  • Build proactive data validation automation to catch data integrity issues
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Organize and lead meetings with business and operational data owners
  • Coordinate and communicate between business users and the data warehouse organization to solve business problems
  • Troubleshoot and resolve data issues
  • Build tabular and/or visualization reports as needed
  • Work independently and with team members to understand database structure and business processes
  • Work within an Agile methodology to design, develop, test, and implement analytics and reporting features and functions
  • Create queries to provide ad hoc reports, analysis, and datasets based on business needs.
  • Take the initiative to work with cross-functional teams to solve business problems
 

Required Skills 

 
  • 3 to 6 years of relevant experience in one of the following areas: data engineering, database engineering, business intelligence, or business analytics.
  • 3+ years of hands-on experience in writing complex, highly optimized SQL queries across large data sets.
  • 2+ years of experience in scripting languages such as Python.
  • 3+ years of working experience in building and optimizing data pipelines, architectures, and datasets.
  • Experience integrating data from multiple sources such as APIs, JSON feeds, other databases, flat files, and spreadsheets.
  • Ability to understand, consume, and use APIs, JSON, and web services in data pipelines.
  • Extensive experience with data warehouses such as Teradata, Oracle, Amazon Redshift, or Snowflake; Snowflake is most preferred.
  • Extensive experience with data integration tools such as Azure Data Factory, Matillion, Jitterbit, or Denodo; Azure Data Factory is most preferred.
  • Design data architecture structures necessary to support BI initiatives.
  • Strong knowledge of SQL, PL/SQL or T-SQL, and data modeling; knowledge of databases such as Snowflake, Microsoft SQL Server, and Oracle.
  • Good understanding of database design, implementation, troubleshooting and maintenance.
  • Proficient in optimizing complex SQL queries.
  • Familiarity with transactional and data warehouse environments
  • End to end experience in designing and deploying reports/dashboards/data visualizations using Power BI, Looker, SSRS, etc.
  • Hands-on experience with analytics tools such as R and Python, and with BI tools, for exploratory and predictive data analytics.
  • Experience optimizing Microsoft Power BI dashboards with a focus on usability, performance, flexibility, testability, and standardization.
  • Experience working with US or overseas clients will be preferred 
  • Experience with API integrations
  • Proficiency in Python and its broader ecosystem
  • Experience with relational SQL and NoSQL databases, such as SQL Server, Oracle, and Snowflake
  • SQL, T-SQL/PL-SQL
  • Reporting & Visualization Tools like Power BI, Looker, SSRS
  • Ability to multi-task
  • Ability to work in a collaborative team environment
  • Strong communication (oral and written) and interpersonal skills required to interact with colleagues and internal customers.
  • Excellent at troubleshooting issues
  • Ability to develop productive business relationships with internal team members through cooperation, courtesy and professionalism
  • Ability to play an integral part in project delivery given tight constraints and uncompromising quality
  • Motivated to identify and develop solutions leveraging best practices
  • Capable of explaining complex technical issues to clients and internal resources

Education


Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. Equivalent academic and work experience may be considered.
About DPR Construction:

DPR Construction is a national commercial general contractor and construction manager specializing in technically challenging and sustainable projects for the advanced technology, biopharmaceutical, corporate office, and higher education and healthcare markets. With the purpose of building great things—great teams, great buildings, great relationships—DPR is a truly great company. For more information, please visit www.dpr.com. 

