Madrid, Spain
Data Architect (gn)
Why SoftwareOne?
SoftwareOne is a leading global software and cloud solutions provider that is redefining how companies build, buy and manage everything in the cloud. By helping clients to migrate and modernize their workloads and applications – and in parallel, to navigate and optimize the resulting software and cloud changes – SoftwareOne unlocks the value of technology. The company’s 8,900 employees are driven to deliver a portfolio of 7,500 software brands with sales and delivery capabilities in 60 countries. Headquartered in Switzerland, SoftwareOne is listed on the SIX Swiss Exchange under the ticker symbol SWON. Visit us at https://www.softwareone.com/en

The role
You will participate as a Data & AI Architect in our ongoing projects.

What will you do?
- You will be responsible for designing and establishing the appropriate data architecture for our cloud-based projects (primarily AWS and GCP). This involves understanding both business and technical requirements, identifying storage, processing, and analytics needs, and designing scalable and efficient solutions using AWS and GCP cloud services.
- You will work closely with Data Engineers and Data Scientists to understand client requirements and challenges, providing technical guidance and advice during the implementation of data solutions and ensuring adherence to industry best practices and standards.
- You will evaluate and select the most suitable AWS and GCP services and tools for each project. This includes understanding the strengths and limitations of each service, as well as staying up to date with the latest trends and features in cloud-based data engineering and data science.
- You will design and develop reliable and efficient data pipelines using AWS and GCP’s data processing and storage capabilities. This includes integrating diverse data sources, performing data transformations and cleansing, and loading data into data warehouses or analytics platforms.
- You will provide technical advice to the Data Engineering and Data Science teams on algorithm selection, data modeling techniques, and analysis strategies, and you will participate in code reviews to ensure the quality and robustness of implemented solutions.
- Finally, you will stay in close contact with third-party solutions that complement the Data & AI ecosystem, ensuring this knowledge is shared and leveraged within the team.

What we need to see from you
If you have more than 4–5 years of experience working in the Data and AI space, particularly as a Data Architect in cloud environments, we are looking for you. We need someone with a strong technical background in Data and AI, but also with the ability to understand functional business needs. We’d like to see the following:
- Data architecture: A solid understanding of data architecture principles and best practices, including appropriate data model selection, choice of data warehouses, and the implementation of efficient data pipelines. Experience designing and implementing data lakes, delta lakes, and/or data warehouses.
- Cloud platforms: Deep knowledge of the services and tools offered by AWS (Amazon Web Services) and/or GCP (Google Cloud Platform). This includes data storage services like S3, Cloud Storage, and BigQuery; data processing tools like AWS Glue and Cloud Dataflow; and analytics and machine learning services such as AWS Redshift and GCP BigQuery ML.
- Programming languages: Experience with Python.
- Database technologies: Familiarity with different types of databases, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- ETL/ELT and data processing: Knowledge of extraction, transformation, and loading (ETL) or extraction, loading, and transformation (ELT) techniques and tools for manipulating and transforming data. Experience using tools like AWS Glue, GCP Dataflow, or Apache Spark for distributed data processing.
- Data analysis and visualization: Understanding of data analysis concepts and techniques, including the use of popular tools like Tableau, Amazon QuickSight, Looker, or Power BI for data analytics and visualization.
- Security and compliance: Awareness of best practices and measures to protect sensitive data and ensure regulatory compliance.
- Best practices and standards: Familiarity with industry best practices in data engineering and data science, including data quality management, data governance, code documentation, and collaborative teamwork.
- Communication skills: Ability to communicate effectively with both technical and non-technical stakeholders. You should be able to present ideas and technical solutions clearly and concisely, and have the facilitation skills to lead discussions and collaborate with team members.

If your experience is mainly with Azure and you have limited exposure to AWS or GCP, we still encourage you to apply.

Job Function
Software & Cloud