Cloud Integration Engineering Manager - Detroit, MI at Geebo

Cloud Integration Engineering Manager

About Bedrock:
Bedrock is a full-service real estate firm specializing in innovative city-building strategies, reaching far beyond the boundaries of bricks and mortar.
Since its founding in 2011, Bedrock and its affiliates have invested and committed more than $5.6 billion to acquiring and developing more than 100 properties in Detroit and Cleveland, including landmark developments at the Hudson's Site, the Book Tower restoration, City Modern and the May Company Building.
Bedrock's portfolio totals more than 22 million square feet of office, retail and residential space within new construction and adaptive reuse projects.
For more information on Bedrock's projects, visit bedrockdetroit.com or engage with us on Facebook, Instagram, Twitter and LinkedIn.
Job Summary:
The Manager of Cloud Integration Engineering will drive quality integration work to completion with hands-on development responsibilities, and will partner with other technology and business teams to provide thought leadership and innovative solutions for Bedrock's data integration needs.
They are able to provide not only long-term vision and strategy, but also practical implementation skills related to data analysis, data tools, technical proficiency and project management in a dynamic environment.
Responsibilities:

  • Design relational and multidimensional database structures, data flows and integration interfaces.
  • Modify existing data structures.
  • Provide technical leadership for system architecture and design.
  • Collaborate with technology and business teams to define standards and best practices for data integration, data cleansing, modeling, queries and data quality.
  • Collaborate with the Data Team to implement new data management policies in coordination with existing architecture.
  • Design and build integration processes for various data sources: on-premises, SaaS and cloud APIs.
  • Develop architecture and design patterns to process and store high-volume data sets.
  • Design and implement automated workflows using scheduling tools.
  • Design and develop API endpoints for internal and cloud systems.
  • Collaborate with technology and business teams to understand requirements, objectives, functions and performance expectations.
  • Identify and propose technical alternatives to resolve system problems and recommend better solutions.
  • Provide production support for the enterprise data warehouse and integration applications.
  • Perform regular performance tuning to discover process improvement opportunities.
  • Identify and implement code improvements for increased performance, efficiency, reliability, maintainability, extensibility and application functionality.
  • Unit test and debug code.
  • Collaborate with technology and business teams for prioritization, impact assessment and resolution.
  • Design and implement product features in collaboration with business and technology partners.
  • Oversee developer activities: ensure the team's work is of the highest quality and delivered on time.
  • Perform design, code and test plan reviews to maintain engineering standards.
  • Additional duties as assigned.

Requirements:

  • Bachelor's degree in IT or a related field, or an equivalent combination of education, experience and training.
  • Minimum of 10 years of experience designing and implementing integrations for data warehouses.
  • Experience with both relational and analytical cloud database technologies: Oracle, SQL Server, MySQL, Postgres, Redshift, BigQuery, Synapse.
  • Experience with both ETL and ELT.
  • Advanced experience with several of the following software, languages and tools: T-SQL, PL/SQL, AWS Glue, DataFusion, Data Factory, MuleSoft, Jitterbit, Talend.
  • Hands-on experience writing complex SQL and tuning database performance.
  • Approximately 6 years of hands-on experience delivering enterprise-scale solutions using one or more modern programming languages such as Java, C# (.NET Core), Python or Scala.
  • Approximately 6 years of experience writing queries against SQL and NoSQL databases.
  • Approximately 3 years of experience designing, developing and deploying APIs using loosely coupled or microservices-based architectures.
  • Strong verbal and written communication skills when interacting with management or executive teams through each phase of the agile software process.
  • Strong familiarity with all aspects of the Agile methodology and the SDLC approach, including exposure to code repositories, testing, debugging and project tracking tools.
  • Prior experience as a technical lead of medium to large teams and a background in mentorship or formal management of software engineers / technical consultants.
  • A passion for crafting and delivering high-quality products, building complex solutions and keeping abreast of the latest technologies.

Preferred:

  • Experience working with multiple file formats (Parquet, Avro, Delta Lake) and APIs.
  • Working knowledge of essential ERP components, particularly Finance & Accounting, Inventory and CRM.
  • Prior experience developing ETL processes with ERP and CRM data sources.
  • Experience with CI/CD processes and source control tools such as GitHub and related development processes.
  • Experience/knowledge of pub/sub models such as Kafka.
  • Experience in the commercial real estate industry.

Recommended Skills:

  • API
  • Agile Methodology
  • Amazon Redshift
  • Analytical
  • Apache Kafka
  • Architecture

Estimated Salary: $20 to $28 per hour based on qualifications.
