Job Description - 
Brainstorm with clients, onsite teams and internal teams to define the problem.
Translate the business problem into an analytical problem.
Identify internal and external data requirements for solving the analytical problem.
Solve the analytical problem using concepts from mathematics, statistics, artificial intelligence and machine learning.
Translate the solution into a business solution and create artefacts that help communicate it to clients, such as dashboards, PowerPoint slides, Excel sheets, etc.

Expectations - 
Work hands-on and provide thought leadership on real-life business problems.
Use analytical thinking and apply complex mathematical techniques.
Work in a challenging environment with a smart peer group.

Experience- 
At least 5 years of working experience in analytics.
Strong analytical and logical thinking, people management and communication skills.
Proficient in: (a) SQL, R or Python; (b) visualization tools like Tableau or Spotfire.
Experience with statistical techniques such as regression, clustering and time-series forecasting.
Bachelor's in Engineering or Master's in Statistics/Economics.

Location-
Bangalore

Experience-
5 to 7

Skills Required-
SQL, R, Python, Tableau, Excel

Department-
Analytics Delivery

Designation-
Analytics Consultant

Job Description - 

Problem Formulation:
Identifies possible options to address the business problem and must possess a good understanding of dimensional modelling.
Must have worked on at least one end-to-end project using any cloud data warehouse (Azure Synapse, AWS Redshift, Google BigQuery).
Good to have an understanding of Power BI and its integration with cloud services like Azure or GCP.
Experience working with SQL Server and SSIS (preferred).

Applied Business Acumen:
Supports the development of business cases and recommendations. Owns delivery of project activities and tasks assigned by others. Supports process updates and changes. Solves business issues.


Data Transformation/Integration/Optimization:
The ETL developer is responsible for designing and creating the data warehouse and all related data extraction, transformation and load (ETL) functions in the company.
The developer provides oversight and planning of data models, database structural design and deployment, and works closely with the data architect and business analyst.
Duties include working in cross-functional software development teams (business analysts, testers, developers) following agile ceremonies and development practices.
The developer plays a key role in contributing to the design, evaluation, selection, implementation and support of database solutions.

Development and Testing:
Develops code for the required solution by determining the appropriate approach and leveraging business, technical and data requirements.
Creates test cases to review and validate the proposed solution design. Works on POCs and deploys the software to production servers.

Education and Expertise- 
Minimum 4-8 years of software development experience.
Bachelor's and/or Master’s degree in computer science.
Delivered DW/ETL projects on any of the cloud platforms (AWS/Azure/GCP).
Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore.
Excellent analytical and problem-solving skills.
Successfully delivered large scale data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile.


Technical Expertise-
Data warehouse design techniques, data modeling, ETL tools (e.g. Informatica), SQL, scripting languages, Power BI, good knowledge of any one cloud platform (Azure, AWS, GCP).


Good to Have-
Minimum 4-8 years of experience in data warehouse design and development for large-scale applications.
Minimum 3 years of experience with star schemas, dimensional modelling, and extract-transform-load (ETL) design and development.
Expertise working with various databases (SQL Server, Oracle).
Experience developing Packages, Procedures, Views and triggers.
Nice to have: experience with big data technologies.
The individual must have good written and oral communication skills.
Nice to have: SSIS.

Location-
Bangalore

Experience-
4 to 8

Skills Required-
Data modeling, ETL tools (e.g. Informatica), SQL, scripting languages, Power BI, good knowledge of any one cloud platform (Azure, AWS, GCP)

Department-
Technology

Designation-
Senior Software Engineer

Job Description - 
Strong grasp of computer science fundamentals and excellent problem-solving skills.
Good understanding of software engineering practices, SDLC.
Expertise with responsive web design and development.
Experience with designing high-performing UI apps.
Good programming skills in JavaScript (ES6 & above).
Understanding of web markup, including HTML5, CSS3.
Good experience with ReactJS.
Strong sense of ownership and accountability.
Excellent communication and teamwork skills.

Expectations-
You are self-motivated, collaborative, eager to learn, and hands on.
You love trying out new apps, and find yourself coming up with ideas to improve them.
You keep up with the latest trends and technologies.
You are particular about following industry best practices and have high standards regarding quality.

Experience-
Writes high-quality code and unit tests; builds and tests as per agreed timelines.
Follows and promotes SDLC best practices: coding standards, testing, code reviews, code comments, etc.
Keeps key stakeholders informed about progress and problems; avoids surprises.
Organises and expresses ideas clearly and concisely.
Exhibits strong problem-solving skills.
Exhibits excellent teamwork and helps achieve team goals.

Location-
Bangalore

Experience-
2 to 5

Skills Required-
ReactJS, HTML5, CSS3, JavaScript (ES6 & above)


Department-
Technology

Designation-
Senior Software Engineer

Job Description - 
Work with the team to define business requirements, come up with an analytical solution, and deliver it with a specific focus on the big picture to drive robustness of the solution.
Work with teams of smart collaborators. Be responsible for their appraisals and career development.
Be part of an inclusive and open environment. A culture where making mistakes and learning from them is part of life.
See how your work contributes to building an organization, and be able to drive org-level initiatives that will challenge and grow your capabilities.

Expectations-
THE IDEAL CANDIDATE WILL:

Writes high-quality code and unit tests; builds and tests as per agreed timelines.
Follows and promotes SDLC best practices: coding standards, testing, code reviews, code comments, etc.
Keeps key stakeholders informed about progress and problems; avoids surprises.
Organises and expresses ideas clearly and concisely.
Exhibits strong problem-solving skills.
Exhibits excellent teamwork and helps achieve team goals.

About you
You are self-motivated, collaborative, eager to learn, and hands on.
You love trying out new apps, and find yourself coming up with ideas to improve them.
You keep up with the latest trends and technologies.
You are particular about following industry best practices and have high standards regarding quality.

Experience-
Strong grasp of computer science fundamentals and excellent problem-solving skills.
Good understanding of software engineering practices, SDLC.
Good programming skills in JavaScript (ES6 and above).
Understanding of web markup, including HTML5, CSS3.
Good understanding of NodeJS.
Good communication skills.
Strong sense of ownership and accountability.

Location-
Bangalore

Experience-
2 to 5

Skills Required-
NodeJS, JavaScript, HTML, CSS3

Department-
Technology

Designation-
Senior Software Engineer

Job Description - 
Proficiency in SQL/Hive [must have]:
Handling and analysing large datasets.
Data cleaning and processing.
R (basic data manipulation, dplyr, ggplot2, random forest).
Python (basic data manipulation using pandas, matplotlib, sklearn).
Tableau/Power BI/QlikView.
Excellent communication skills.
Collaborative mindset to work with clients and internal teams.
Motivation to learn constantly.

Expectations-
You will:
Be part of a team that solves business problems, which involves:
Brainstorm with clients and internal teams to define a problem.
Translate the business problem into an analytical problem.
Solve the analytical problem using a combination of Technology, Math and Domain knowledge.

You can expect to:
Be trained, guided and mentored to build a career in analytics.
Use a combination of Technology (70%), Math (20%) and Domain knowledge (10%) to solve real-world business problems.


Experience-
You have:
At least 2 years of work experience.
Proficiency in SQL/Hive [must have]:
Handling and analysing large datasets.
Data cleaning and processing.

Good experience of:
Basic operations: joins, unions, table properties.
Window functions: rank, partition by.
Parameterizing queries.
Optimized table formats.
Query optimization parameters.
MySQL query optimization.
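The SQL skills listed above (joins, window functions with rank/partition by, parameterized queries) can be sketched briefly. The example below is purely illustrative: it uses Python's built-in sqlite3 module and a hypothetical sales table, not any system named in this role.

```python
import sqlite3

# Illustrative in-memory database with a hypothetical sales table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('North', 'A', 100), ('North', 'B', 200),
        ('South', 'C', 150), ('South', 'D', 50);
""")

# Window function: rank reps by amount within each region
# (RANK() OVER with PARTITION BY, as listed above).
rows = conn.execute("""
    SELECT region, rep,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
""").fetchall()

# Parameterized query: placeholders keep values out of the SQL string,
# which is both safer and friendlier to the query planner's cache.
north = conn.execute(
    "SELECT rep FROM sales WHERE region = ? ORDER BY amount DESC",
    ("North",),
).fetchall()
```

Here `rows` pairs each rep with its per-region rank (e.g. rep B ranks 1st in North), and `north` returns the North-region reps ordered by amount.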

Good to Have-
R (basic data manipulation, dplyr, ggplot2, random forest)
Python (Basic data manipulation using pandas, matplotlib, sklearn)
Hadoop, Spark
Tableau
Excellent communication skills
Collaborative mindset to work with clients and internal teams
Motivation to learn constantly

Location-
Bangalore

Experience-
2 to 5

Skills Required-
SQL, R, Python, Tableau, Excel

Department-
Analytics Delivery

Designation-
Senior Business Analyst

Job Description - 
The role requires experience in AWS as well as programming experience in Python and Spark.
You Will:
Translate functional requirements into technical design.
Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core cloud services needed to fulfil the technical design.
Design, Develop and Deliver data integration interfaces in AWS.
Design, Develop and Deliver data provisioning interfaces to fulfil consumption needs.
Deliver data models on the cloud platform; it could be on AWS Redshift or SQL.
Design, Develop and Deliver data integration interfaces at scale using Python / Spark.
Automate core activities to minimize the delivery lead times and improve the overall quality.
Optimize platform cost by selecting right platform services and architecting the solution in a cost-effective manner.
Manage code and deploy DevOps and CI/CD processes.
Deploy logging and monitoring across the different integration points for critical alerts.

Experience-
Minimum 4 years of software development experience.
Bachelor's and/or Master’s degree in computer science.
Strong Consulting skills in data management including data governance, data quality, security, data integration, processing and provisioning.
Delivered data management projects on the AWS cloud.
Translated complex analytical requirements into technical design including data models, ETLs and Dashboards / Reports.
Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases.
Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing.
Successfully delivered large scale data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile.
Strong knowledge of continuous integration, static code analysis and test-driven development.
Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore.
Must have excellent analytical and problem-solving skills.
Delivered change management initiatives focused on driving data platforms adoption across the enterprise.
Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations.

Location-
Bangalore

Experience-
3 to 8

Skills Required-
Python, Spark, AWS, SQL

Department-
Technology

Designation-
Technology Consultant

Job Description - 
We are looking for an analytical, big-picture thinker who is driven to enhance and further the mission of Company by delivering technology to internal business and functional stakeholders. You will serve as a leader driving the IT strategy to create value across the organization. This Data Engineer will be empowered to lead the engagement, focusing on implementing both low-level, innovative solutions and the day-to-day tactics that drive efficiency, effectiveness and value.

You will play a critical role in creating and analysing deliverables that provide critical content to enable fact-based decision making and facilitate successful collaboration with business stakeholders. You will analyse, design and develop best-practice business changes through technology solutions.

Expectations-
About you:

You are self-motivated, collaborative, eager to learn, and hands on.
You love trying out new apps, and find yourself coming up with ideas to improve them.
You keep up with the latest trends and technologies.
You are particular about following industry best practices and have high standards regarding quality.

Experience-
Have Implemented and Architected solutions on Google Cloud Platform using the components of GCP.
Experience with Apache Beam/Google Dataflow/Apache Spark in creating end to end data pipelines.
Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
Experience programming in Java, Python, etc.
Expertise in at least two of these technologies: Relational Databases, Analytical Databases, NoSQL databases.
Certification as a Google Professional Data Engineer / Solution Architect is a major advantage.

Education and Expertise-
4-7 years of IT or professional services experience in IT delivery or large-scale IT analytics projects.
Candidates must have expert knowledge of Google Cloud Platform; other cloud platforms are nice to have.
Expert knowledge in SQL development.
Expertise in building data integration and preparation tools using cloud technologies (like SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.).
Experience with Apache Beam/Google Dataflow/Apache Spark in creating end to end data pipelines.
Experience programming in Java, Python, etc.
Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions.
Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with complex datasets.

Location-
Bangalore

Experience-
4 to 7

Skills Required-
GCP, Big Data, Python, NoSQL

Department-
Technology

Designation-
Senior Data Engineer

Job Description - 
The role requires experience in Azure core technologies - Azure Data Lake Storage, Azure Data Lake Analytics, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight and Azure SQL Data Warehouse.
You Have:
Minimum 2 years of software development experience.
Bachelor's and/or Master’s degree in computer science.
Strong consulting skills in data management including data governance, data quality, security, data integration, processing and provisioning.
Led and delivered data management projects in Azure Cloud.
Translated complex analytical requirements into technical design including data models, ETLs and Dashboards / Reports.
Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases.
Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing.
Successfully delivered large scale data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile.
Strong knowledge of continuous integration, static code analysis and test-driven development.
Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore.
Must have excellent analytical and problem-solving skills.
Delivered change management initiatives focused on driving data platforms adoption across the enterprise.
Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations.

Expectations-
You Will:
Translate functional requirements into technical design.
Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design.
Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks.
Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs.
Deliver data models on the Azure platform; it could be on Azure Cosmos DB, SQL DW / Synapse or SQL.
Advise clients on ML Engineering and deploying ML Ops at Scale on AKS.
Automate core activities to minimize the delivery lead times and improve the overall quality.
Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner.
Deploy Azure DevOps and CI CD processes.
Deploy logging and monitoring across the different integration points for critical alerts.

Location-
Bangalore

Experience-
2 to 8

Skills Required-
Azure Data Lake, Data Factory, SQL database, Spark, Python, Kubernetes

Department-
Technology

Designation-
Technology Consultant
