
Career Opportunities

CloudCollab is helping companies build the future of telecommunications. We invite you to be part of this journey by joining our dynamic team.

The developer is responsible for a range of Java-related duties throughout the software development lifecycle, from concept and design through testing, and is expected to create user information solutions by developing, implementing, and maintaining Java-based components and interfaces.

Position Requirements:


  • As a technical team member, work with the offshore/US project team and client on engagements focused on Oracle EBS/Cloud applications.
  • Understand customer business processes and functional specifications, and prepare technical designs.
  • Provide technical support for Oracle Integration Cloud, data conversion, and reports.
  • Develop and unit test technical components as per standards.
  • Participate in automation and digital transformation activities within or outside the client projects.

Desired Knowledge:


  • Good knowledge of modules and processes around Oracle Finance/SCM applications.
  • End-to-end implementation experience with Oracle Cloud/ERP applications.
  • Excellent communication skills and the ability to interact with external teams or clients.
  • Experience working with clients and US counterparts to understand their business requirements and provide the right solutions.

Must have Skills:


The candidate should possess strong knowledge in three or more of the areas below:

  • SQL and PL/SQL
  • Data migration using FBDI
  • Oracle SaaS BI/OTBI/FR reports
  • Cloud integration (ICS/OIC)
  • Oracle VBCS/APEX

Good to have skills:

  • Knowledge of emerging technologies such as RPA, IoT, and blockchain.

QA Automation Engineer who understands the basics of DevOps (pipeline structures, Jenkins).

Key Skills:


  • Proficiency in Java, with a good understanding of its ecosystems
  • Expert in Selenium
  • Expert in REST Assured

NOTE: A test framework is not required, as we're using TAP.

Key Skills:


  • Frontend: HTML (Bootstrap 4 & 5), CSS/JS mandatory; Angular/React optional.
  • Backend: Python with Flask framework is mandatory.
  • DevOps: Basic knowledge of cloud and infra services with CI/CD & deployment pipelines.

Preferred Skills:


  • Hands-on with containers (Docker)
  • Hands-on with orchestration (Kubernetes)
  • Hands-on with CI/CD tools (Jenkins, Groovy, Git)
  • Hands-on with monitoring tools (Prometheus, Grafana, Kibana, and Elastic)
  • Hands-on with Chef and Ansible
  • Scripting: Bash/Python
  • Knowledge of GKE or AKS
  • AWS/GCP preferred for cloud

SRE 1


  • Hands-on with containers (Docker)
  • Hands-on with orchestration (Kubernetes)
  • Hands-on with CI/CD tools (Jenkins, Groovy, Git)
  • Hands-on with monitoring tools (Prometheus, Grafana, Kibana, and Elastic)
  • Hands-on with Chef and Ansible
  • Scripting: Bash/Python
  • Knowledge of GKE or AKS
  • AWS/GCP preferred for cloud
  • Good troubleshooting skills

As the Security Lead Engineer, you will be responsible for leading and implementing security measures to safeguard our organization's systems, networks, and data. You will play a pivotal role in securing our infrastructure, detecting and mitigating threats, and ensuring compliance with security best practices and standards.

Key Responsibilities:


  • Lead the design, implementation, and maintenance of security measures, including SIEM, network security, vulnerability management, and monitoring solutions.
  • Develop and enforce security policies, procedures, and standards to protect the organization from cyber threats.
  • Collaborate with cross-functional teams to integrate security into the software development lifecycle, network architecture, and cloud environments.
  • Manage and configure SIEM tools (e.g. QRadar, Splunk, Elastic) to collect, analyze, and correlate security data from various sources.
  • Oversee network security measures, including firewalls, intrusion detection/prevention systems, and access control mechanisms (e.g., WAF, IDS/IPS, Tenable.io, Nessus)
  • Conduct vulnerability assessments, penetration tests, and security audits to identify and remediate security weaknesses.
  • Lead incident response and investigation efforts to address security incidents and breaches promptly.
  • Implement and manage security monitoring and alerting systems to detect and respond to security events in real-time.
  • Stay current on emerging threats, vulnerabilities, and security technologies to recommend and implement security enhancements.
  • Mentor and provide guidance to junior security team members.
  • Collaborate with third-party security vendors and assess their services for alignment with organizational security goals.

Qualifications:


  • Bachelor’s degree in computer science, Information Security, or a related field (Master's preferred).
  • 6+ years of experience in information security roles with a focus on SIEM, Network Security, Vulnerability Management, and Monitoring.
  • Proficiency in configuring and managing SIEM solutions (e.g. QRadar, Splunk, Elastic).
  • Strong knowledge of network security principles and best practices.
  • Experience with vulnerability scanning tools (e.g. Tenable.io, Nessus, Qualys) and penetration testing.
  • Familiarity with cloud security concepts and practices (e.g. IBM Cloud, AWS, Azure, GCP).
  • Security certifications such as CISSP, CISM, or GIAC are a plus.
  • Excellent problem-solving skills and a deep understanding of security frameworks and standards.
  • Strong communication and cross-functional and team leading abilities.
  • Commitment to continuous learning and staying updated on security trends and threats.

We are looking for an experienced API developer using Apigee to join our team. The ideal candidate will be responsible for designing, developing, and maintaining APIs that enable communication between various software applications. The API developer will work closely with cross-functional teams to understand business requirements and translate them into API specifications.

Key Responsibilities:


  • Design, develop, and maintain APIs using Apigee
  • Collaborate with cross-functional teams to understand business requirements and translate them into API specifications
  • Implement security measures to protect APIs from unauthorized access
  • Test and debug APIs to ensure they are working correctly
  • Monitor and troubleshoot production APIs to ensure they are meeting SLAs
  • Continuously improve API performance and scalability
  • Document APIs and provide support to API users

Requirements:


  • Bachelor's degree in Computer Science, Software Engineering or related field
  • Minimum of 3 years of experience in designing and developing APIs using Apigee
  • Strong knowledge of RESTful API design principles
  • Experience with API security and authentication protocols (OAuth2, JWT, etc.)
  • Understanding of API management tools and concepts (API Gateway, API Proxy, etc.)
  • Familiarity with API monitoring and analytics tools (Apigee Analytics, Splunk, etc.)
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills

As a Senior DevOps Engineer with a strong emphasis on DevOps practices, you will lead the charge in optimizing our software development and operations processes. Your primary objective is to elevate our DevOps capabilities by focusing on key areas such as build pipelines, automation, and code quality, all while incorporating vital security components. The ideal candidate will possess a deep understanding of security from a DevOps perspective, be able to identify gaps, and bring the DevSecOps framework to the next maturity level, seeing the bigger picture through effective use of automation, coordination with cross-functional teams, and adoption of industry best practices within the team.

Key Responsibilities:


  • Spearhead the design, implementation, and maintenance of efficient CI/CD pipelines and automation frameworks, with a foundation in DevOps best practices.
  • Identify areas for improvement and implement automation solutions to streamline critical processes.
  • Collaborate closely with development and operations teams to seamlessly integrate security into the software development lifecycle.
  • Champion code quality and sanity checks, emphasizing early detection and remediation of issues.
  • Incorporate security tools such as Sonarqube, Data Theorem, Veracode, and BurpSuite into the DevOps workflow to enhance code security without compromising agility.
  • Actively monitor and respond to security-related concerns within the DevOps pipeline.
  • Foster a security-conscious culture by providing guidance and promoting security awareness among development and operations teams.
  • Stay informed about evolving security practices within the DevOps landscape.

Qualifications:


  • Bachelor’s degree in computer science or a related field (Master's preferred).
  • 6+ years of experience as a DevOps Engineer with a strong focus on DevOps fundamentals.
  • Proficiency in scripting and automation, including Python, Shell scripting, and DevOps toolsets.
  • Hands-on experience with DevOps tools like Docker, Kubernetes, Jenkins, and GitLab CI/CD, with an emphasis on code quality, automation, and efficiency.
  • A solid understanding of security concepts from a DevOps perspective, including secure coding practices.
  • Familiarity with security tools and practices, such as Sonarqube, Data Theorem, Veracode, and BurpSuite, as integral parts of the DevOps workflow.
  • Exceptional problem-solving skills and meticulous attention to detail.
  • Strong communication and collaboration abilities.
  • A commitment to staying current with evolving DevOps and security trends and technologies.

In this role, you will take the lead in enhancing our DevOps capabilities, ensuring that our software development processes are efficient, secure, and optimized for quality. By balancing DevOps excellence with a security-conscious mindset, you will help us deliver reliable and secure software solutions.

Minimum Qualifications:


  • BE/BS, with 10+ years of experience or ME/MS with 9+ years of experience in software development
  • Sound knowledge of database design patterns, OOP concepts, and RDBMS concepts
  • Strong, fully hands-on experience with MariaDB
  • Optimization of queries, performance tuning
  • Ability to debug DB core and analyse locks
  • Administration of MariaDB and MySQL databases on Unix
  • Excellent troubleshooting skills
  • Installation, upgrade, and patching of databases
  • Understanding of backup and restore
  • Good understanding of cloud RDS
  • Commitment to quality and high standards
  • Excellent communication skills

The ideal candidate for this role will have strong Core Java skills and a solid understanding of building, testing, and deploying microservices. Primary responsibilities include designing, developing, and testing globally deployed, cloud-based microservices solutions with high availability; applying current software development practices and principles to identify and implement process improvements; and working with microservices teams on RESTful API designs, assisting with future scripted APIs and WebSocket investigations.

Key Skills:


  • Mandatory - Java 8-11, RDBMS concepts, Spring/Spring Boot, REST APIs, WebSockets, SOA, microservices, AWS EC2, Lambda, S3, Docker containers
  • Nice to have - React JS, Redux, CSS or Angular JS

Technical Competency:


  • Strong Core Java skills, microservices, REST APIs, Spring, Hibernate, WebSockets, CI/CD.
  • Solid experience with SQL/NoSQL and cloud-based technologies.
  • Good experience with manual and automation testing with tools like Selenium.
  • Ability to work independently or as part of a larger global development team in agile.
  • Willingness to learn new technologies and demonstrate commitment to excellence for the continuous improvement of our products, code base, processes, and tools.
  • Use of test management and bug management tools like Zephyr and JIRA.
  • Strong knowledge of CI tools like Jenkins to align with the DevOps deployment process.
  • Experience with Agile or Scaled Agile Framework (SAFe) work environments
  • Understanding and experience of BDD/TDD strengths and weaknesses is a plus.
  • Experience and interest in JavaScript, HTML, and CSS is an added advantage.

Preferred Skills:


  • Designing a modern highly responsive web-based user interface
  • Strong and hands-on experience in React JS, Redux, HTML, modular CSS, JavaScript, libraries, components.
  • Familiarity working with REST APIs for deep integrations with platforms
  • Experience with automated testing suites, like Jest or Mocha
  • Should understand principles of mobile development
  • Should work closely with our product, design, and UX teams to create amazing and intuitive experiences that make it effortless to connect different apps together.

This is a hybrid model: 2 or 3 days of work from the office per week.

Key Skills:


  • C/C++
  • Linux
  • OOP concepts
  • Data structures
  • Multithreading
  • Networking
  • Telecom

Job duties:


  • Develop, evangelise, and enforce enterprise data standards across TVS group companies.
  • Liaise with different stakeholders in converting the data management strategy and policies to detailed standards, guidelines, and design patterns for managing data across its lifecycle.
  • Take ownership of building required checklists enforcing the data standards across SDLC.
  • Document various design patterns around data modeling, data storage, security, movement, integration, retention, transformations, consumption, and purge covering different use cases (BPM applications, OLAP applications, data engineering, master data management, real-time use cases, data migrations, etc.)
  • Be a go-to resource for data management and provide advice to application development teams on an as-needed basis.
  • Create enterprise data models (conceptual, and logical) for consumption across the enterprise.

Skills and Experience:


  • 10-15 years of experience on data projects (data modeling, data architecture, data warehousing, data lakes, data engineering, business intelligence, etc.)
  • 7+ years of relevant experience as an enterprise data architect.
  • Must have provided data architecture to complex data projects covering entire data lifecycle.
  • Strong understanding of multi-domain MDM design patterns.
  • Extensive consulting background is a plus.
  • Must have experience in creating high-impact, detailed documents and presentations for consumption by (presenting to) audience across different levels.
  • Must have experience in cloud-based data architectures.
  • Good understanding of engineering processes such as release management and enterprise BOM management/change management
  • Certification in enterprise architecture frameworks such as TOGAF, Zachman is a plus.
  • Experience in creating architecture document, detailed design, recommendations, etc. with a top-down approach.
  • Experience in manufacturing or automotive domain is a plus.
  • Must have strong written and verbal communication skills.

Job duties:


  • Understand the current master data sources and application, related data quality issues, and data consumption issues and provide a scalable MDM solution architecture
  • Lead the technical implementation of a MDM tool and ensure that a single source of truth is available for data consumers
  • Profile the data from different master data sources, collaborate with different business and IT users to understand the current-state pain points and establish implementation success criteria
  • Provide daily status reports to supervisor
  • Develop a master data model that will cater to all data consumers
  • Enrich the master data with other data to add more context to the master data
  • Provide high-level and low-level designs for the master data pipeline (sourcing, cleansing, standardizing, matching, merging, exception workflow, master data publishing, etc.)
  • Help data project manager in mitigating the risks in MDM implementation
  • Design a report on MDM operational metrics

Skills and Experience:


  • Bachelor's degree in any discipline
  • 8+ years of experience developing and deploying MDM solutions using different implementation styles
  • At least 5 years of experience as an MDM Architect
  • Strong understanding of multi-domain MDM design patterns.
  • Experience in automotive industry
  • Hands-on experience in implementing any of the popular MDM tools (Informatica, Ataccama, Profisee, Syndigo, Talend, etc.)
  • Experience training the data stewards on responding to the MDM workflow exceptions
  • Ability to work in an agile, dynamic environment
  • Self-starter with an ability to work with minimal or no direction
  • Good verbal and written communication skills, and presentation skills

Job duties:


  • Connect to different data sources and harvest technical metadata (data at rest and data in motion).
  • Coordinate with Business and IT stakeholders and enrich the data catalog with additional metadata (business terms, attribute descriptions, data classifications, data domain, data owner, data steward, process, system, regulation, policy, etc.)
  • Validate the metadata with relevant stakeholders before publishing for production use.
  • Build data quality rules for critical data elements and validate them with relevant stakeholders.
  • Train data stewards and other business, IT stakeholders on how to use the metadata management tool
  • Create a standard process to gather business and technical metadata and train stakeholders
  • Provide weekly status reports on the metadata harvesting and enrichment progress to relevant stakeholders.

Skills and Experience:


  • 6-10 years of overall experience working on any data related products.
  • 4 years of relevant experience in data governance projects (more specifically related to data cataloging, and metadata management).
  • Must have experience using any popular data cataloging and metadata management tool to harvest and enrich metadata
  • Data quality experience (data quality rules configuration, data profiling, scorecarding, etc.) is a big plus.
  • Good verbal and written communication skills.
  • Must be a self-starter and be able to work independently.
  • Must be customer-facing.
  • Must be hands-on.

Job duties:


  • Design ML and DL algorithms based on the requirement.
  • Assist with building a feature store for improving reusability across the enterprise.
  • Assist with building an end-to-end MLOps framework and a repeatable process, preferably on the Azure platform, for all data science and AI teams to use (a brief sketch of a typical training step follows this list).
  • Demonstrate the use of feature store and MLOps pipeline with the help of two to three machine learning and deep learning use cases.
  • Build, test and demonstrate a repeatable process with custom or external tools for unstructured data annotation.
  • Create documentation and training material on the frameworks and processes developed.
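
Purely as a hedged illustration of the kind of model training and packaging step that such an MLOps pipeline would automate, here is a short sketch using scikit-learn and joblib; the dataset, metric, and output path are placeholders and are not part of the actual project.

    # Illustrative only: a toy training-and-packaging step of the kind an MLOps
    # pipeline would automate. Dataset, metric, and output path are placeholders.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    import joblib

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    # A pipeline would log this metric and gate promotion/deployment on it.
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

    # Persist the model artifact for a downstream registration/deployment step.
    joblib.dump(model, "model.joblib")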

Skills and Experience:


  • 7-10 years of overall experience in data science and machine learning.
  • 4+ years of experience building ML models and engineering automated processes to test, operationalize, and monitor them.
  • Experience in building a feature store/registry is a must.
  • Experience in unstructured data annotation is required.
  • Experience in Databricks platform with Unity catalog for machine learning development is a must.
  • Experience in Azure ML tool stack is a plus.
  • Hands-on and expert-level experience in using frameworks such as TensorFlow, Keras, Scikit-Learn, PyTorch, etc.
  • Good verbal and written communication skills.
  • Must be a self-starter and be able to work independently.
  • Must be customer-facing.
  • Must be hands-on.

Responsibilities:


  • Collaborate with clients and functional consultants to gather business requirements and translate them into technical specifications for Oracle Extensions VBCS solutions.
  • Design and develop custom extensions and integrations using Oracle VBCS, leveraging visual development tools, JavaScript, HTML, CSS, and other relevant technologies.
  • Customize and extend Oracle applications, modules, and workflows using VBCS to meet specific business needs.
  • Develop data models, create REST APIs, and configure integrations to connect Oracle VBCS with other systems and databases.
  • Perform testing, debugging, and troubleshooting to ensure the quality, performance, and security of Oracle Extensions VBCS solutions.
  • Collaborate with cross-functional teams to ensure smooth integration with existing systems and data sources.
  • Provide technical guidance and expertise to clients and project teams throughout the implementation lifecycle.
  • Stay updated with the latest Oracle VBCS features, enhancements, and best practices for rapid application development.
  • Document technical specifications, configurations, and procedures related to Oracle Extensions VBCS solutions.
  • Assist in the migration and deployment of Oracle Extensions VBCS applications to production environments

Mandatory :


  • Experience in developing data models using the DB and working with REST and SOAP web services
  • Basic knowledge of JavaScript, HTML, and CSS
  • VBCS custom page development experience

Responsibilities:


  • As a technical team member, work with the offshore/US project team and client on engagements focused on Oracle EBS/Cloud applications.
  • Understand customer business processes and functional specifications, and prepare technical designs.
  • Provide technical support for Oracle Cloud integration, data conversion, and reports.
  • Develop and unit test technical components as per PwC standards.
  • Participate in automation and digital transformation activities within or outside the client projects.

Desired Knowledge:

  • Good knowledge of modules and processes around Oracle Finance/SCM applications.
  • End-to-end implementation experience with Oracle Cloud/ERP applications.
  • Excellent communication skills and the ability to interact with external teams or clients.
  • Experience working with clients and US counterparts to understand their business requirements and provide the right solutions.

Must have skills:


  • The candidate should possess strong knowledge in three or more of the areas below:
  • SQL and PL/SQL
  • Data migration using FBDI
  • Oracle SaaS BI/OTBI/FR reports
  • Cloud integration (ICS/OIC)
  • Oracle VBCS/APEX

Good to have skills:


  • Knowledge of emerging technologies such as RPA, IoT, and blockchain.

As a technical team member, work with the offshore/US project team and client on engagements focused on Oracle EBS/Cloud applications.

Desired Knowledge :


  • Good knowledge of modules and processes around Oracle Finance/SCM applications.
  • End-to-end implementation experience with Oracle Cloud/ERP applications.
  • Excellent communication skills and the ability to interact with external teams or clients.
  • Experience working with clients and US counterparts to understand their business requirements and provide the right solutions.

Must have skills:


  • The candidate should possess strong knowledge in three or more of the areas below:
  • SQL and PL/SQL
  • Data migration using FBDI
  • Oracle SaaS BI/OTBI/FR reports
  • Cloud integration (ICS/OIC)
  • Oracle VBCS/APEX

Responsibilities:


  • As a conversion lead, work with the offshore/US project team and client on engagements focused on Cloud applications, and drive different teams toward the data conversion schedule defined in the project plan.
  • Understand customer business processes and functional specifications in the area of Finance data conversions, and prepare technical designs.
  • Provide technical support for data conversion; develop and unit test technical components as per PwC standards.

Desired Knowledge:

  • Good knowledge of modules and processes around Oracle Financials and Oracle Supply Chain modules.
  • End-to-end implementation experience with Oracle Cloud/ERP applications.
  • Understanding of the end-to-end conversion process and steps.
  • Excellent communication skills and the ability to interact with external teams or clients.
  • Experience working with clients and US counterparts to understand their business requirements.

Must have skills:


  • The candidate should possess strong knowledge in the three areas below:
  • SQL and PL/SQL
  • Data migration using FBDI and ADFDI
  • Understanding of REST and SOAP APIs in SaaS

Good to have skills:


  • Alteryx, Jira and Talend

Job Requirements:


  • Candidate should have a minimum of 4 years of experience in testing.
  • 3+ years in C# automation.
  • 2+ years in SpecFlow and BDD.

About You :

We're looking for a high-achieving, full-time Staff AI Engineer to join our engineering team: someone who has an interest in and a good understanding of DevOps and wants to help design, implement, launch, and scale major AI/ML systems and user-facing features. You're comfortable working in a fast-paced environment with a small and talented team where you're supported in your efforts to grow professionally. You're able to manage your time well, communicate effectively, and collaborate in a fully distributed team.

Our backend tech stack currently consists primarily of Python Flask web apps. Our data stores include MongoDB, Neo4J, and Redis. We utilize AWS EventBridge for our integrations and design loosely coupled applications. Our stack includes GPT integrations and data processing pipelines supporting generative AI applications. The underlying infrastructure runs on AWS using a combination of managed services like EKS and non-managed services running on EC2 instances. All of our compute runs through CI/CD pipelines that build Docker images, run automated tests, and deploy to our clusters in AWS. Our backend primarily serves a well-documented API that our front-end Salesforce App and web app consume. Our infrastructure is automated using Terraform and other AWS tools.
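
To give a feel for the loosely coupled, event-driven integration style described above, here is a minimal sketch of publishing a domain event to AWS EventBridge with boto3. It is illustrative only: the bus name, event source, and payload fields are hypothetical and are not taken from the posting.

    # Illustrative sketch: publishing a domain event to an EventBridge bus with boto3.
    # The bus name, source, detail-type, and payload are hypothetical placeholders,
    # and AWS credentials are assumed to come from the environment.
    import json

    import boto3

    events = boto3.client("events", region_name="us-east-1")

    response = events.put_events(
        Entries=[
            {
                "EventBusName": "app-events",        # hypothetical bus name
                "Source": "app.donations",           # hypothetical event source
                "DetailType": "DonationCreated",     # hypothetical event type
                "Detail": json.dumps({"donation_id": "123", "amount_cents": 5000}),
            }
        ]
    )

    # FailedEntryCount flags partial failures; successful entries carry an EventId.
    print(response["FailedEntryCount"], response["Entries"])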

Key Responsibilities:


  • Conceive, design, build, and launch intelligent features which will drive social impact in the world for the causes we all care about.
  • Design and implement data integration, acquisition, cleansing, harmonization, and transformation processes to create curated, high-quality datasets for data science, data discovery, and use in vector stores/embeddings.
  • Develop and maintain scalable data processing pipelines and systems to support Generative AI applications
  • Monitor and optimize the performance and manage the costs of data processing pipelines and systems
  • Monitor technology trends and advancements in Generative AI and incorporate them to continuously innovate
  • Collaborate with Solution Architects, Data Scientists, Software Engineers and DevOps Engineers, Product Owners, researchers and business stakeholders on the cross-functional team and across teams to understand business needs, derive technical requirements and ensure data availability, quality and responsible data use adhering to security, privacy and compliance requirements
  • Continuously improve data acquisition, preparation, transformation, and publishing processes to meet business needs

Requirements:


  • Bachelor’s Degree in Computer Science or similar discipline.
  • 7+ years of experience in data engineering, with 6 months of hands-on experience in Generative AI technologies like vector databases
  • Hands-on experience with text processing tools (e.g., spaCy, NLTK, Word2vec, PyTorch)
  • Strong knowledge of network security principles and best practices.
  • Experience with security models and development on large data sets
  • Experience with at least one cloud platform like Azure, AWS, GCP
  • Proficiency with data engineering technologies and tools (e.g. Hadoop, Airflow, Pandas, etc.)
  • Hands-on experience with monitoring, management, scalability and automation of data processing pipelines and systems
  • Hands-on experience in software development with one major programming language (e.g. Python)
  • Excellent communication skills, with the ability to explain complex concepts in simple language
  • Hands-on experience in machine learning, especially deep neural networks and Generative AI models
  • Understanding the difference, advantages, and disadvantages between the most common large language models (GPT, Llama, etc.)
  • Experience in developing end-to-end production-grade solutions with cloud and AI technology
  • Understanding of best practices for inclusion and maintenance of document stores in retrieval augmented generation strategies
  • Hands-on experience with information retrieval tools (e.g., ChromaDB, Pinecone, pgvector, Elasticsearch, or other vector stores); a short vector-store sketch follows this list
  • Passion for learning, innovation and staying current with industry trends in AI and technology
  • Good overview of re-usable frameworks and tools in the field of Generative AI (both commercial and open-source)
  • Comfortable solving ambiguous problems and adapting to a dynamic environment
  • Relentless with best practices and willing to discuss the choices you make with your fellow engineers and manager.
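
As a rough, hedged illustration of the vector-store retrieval pattern mentioned above, the sketch below stores a couple of documents in an in-memory ChromaDB collection and runs a similarity query. The collection name and documents are invented, and the exact ChromaDB API can vary slightly between versions.

    # Hedged illustration of vector-store retrieval with ChromaDB (in-memory client,
    # default embedding function). Collection name and documents are invented.
    import chromadb

    client = chromadb.Client()  # ephemeral, in-memory instance
    collection = client.create_collection(name="docs")

    collection.add(
        documents=[
            "Donors can schedule recurring gifts from the web app.",
            "The Salesforce app syncs campaign data nightly.",
        ],
        ids=["doc-1", "doc-2"],
    )

    # Similarity search over the embedded documents.
    results = collection.query(query_texts=["How do recurring donations work?"], n_results=1)
    print(results["documents"])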

Bonus points if you have:


  • Contributed open source code related to our tech stack.
  • Led small project teams building and launching features.
  • Built B2B SaaS products.
  • Worked with complex architectures that support multiple APIs (e.g. REST, GQL, WebSockets) as well as async task and event processing frameworks.

Benefits:


  • Competitive wages.
  • 10 days of accrued paid vacation that increases with tenure.
  • 8 days of paid sick leave annually.
  • 10 additional paid company holidays.
  • Medical, dental, vision & life insurance options.

Salary Offer:


    We determine your level based on interview performance and make an offer based on geo-located salary bands. During the hiring process, we review the base salary, benefits, and number of options. Please keep in mind that any equity portion of an offer is not included in these numbers and can represent a significant part of your total compensation.

Job duties :

  • Build an automotive customer/prospect database with key contact information.
  • Collaborate with different data providers and ensure that good quality data feeds are established.
  • Liaise with Data Engineering teams in ensuring that the data coming from third parties is integrated and automated to continuously update the prospect database.
  • Liaise with Marketing teams to conduct roadshows and events if required.
  • Gather prospect data from Marketing teams on the events and roadshows they conducted.
  • Put data quality checks in place to ensure the prospect data is complete and accurate.
  • Provide daily status reports to the supervisor.

Skills and Experience:


  • 5+ years of experience in building prospect databases.
  • Experience in Marketing products, persuading prospects to provide information and potentially buy products.
  • Must have good verbal and written communication skills.
  • Must be a self-starter and go-getter.
  • Must be able to work independently with very minimal direction and hand-holding.
  • Must be data-savvy.
  • Must have a flair for meeting new people and building connections.

Job duties :

  • Gather the requirements for a Customer master data workflow, design a solution, develop, test, and deploy.
  • Code Azure Databricks/Spark scripts to match and merge duplicate customer data based on predefined matching rules (deterministic and fuzzy logic); see the sketch after this list.
  • Develop a user interface and a configurable workflow so that potential duplicate customer records can be reviewed and merged manually.
  • Provide daily status reports to supervisor.
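
As a loose illustration of the deterministic-plus-fuzzy matching idea above, the PySpark sketch below blocks candidate pairs on postal code and flags likely duplicates by name edit distance. Column names, sample rows, and the threshold are placeholders, not project specifics.

    # Loose sketch of deterministic + fuzzy duplicate matching with PySpark.
    # Column names, sample rows, and the edit-distance threshold are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("customer-dedup-sketch").getOrCreate()

    customers = spark.createDataFrame(
        [
            (1, "Acme Industries", "600001"),
            (2, "ACME Industries Ltd", "600001"),
            (3, "Globex Corp", "560076"),
        ],
        ["id", "name", "postal_code"],
    )

    a, b = customers.alias("a"), customers.alias("b")

    candidates = (
        a.join(b, (F.col("a.postal_code") == F.col("b.postal_code")) & (F.col("a.id") < F.col("b.id")))
         .withColumn("name_a", F.lower(F.col("a.name")))
         .withColumn("name_b", F.lower(F.col("b.name")))
         .withColumn("distance", F.levenshtein("name_a", "name_b"))
         # Deterministic rule: identical normalized names; fuzzy rule: small edit distance.
         .filter((F.col("name_a") == F.col("name_b")) | (F.col("distance") <= 5))
    )

    candidates.select(
        F.col("a.id").alias("id_left"),
        F.col("b.id").alias("id_right"),
        "name_a",
        "name_b",
        "distance",
    ).show(truncate=False)

    spark.stop()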

Skills and Experience:


  • 7+ years of experience in Java full-stack development.
  • 2+ years of experience in any workflow development.
  • 4+ years of experience in Azure Databricks/Spark development.
  • Must have good verbal and written communication skills.
  • Must be a self-starter and go-getter.
  • Must be able to work independently with very minimal direction and hand-holding
  • Must be data-savvy.

Job duties :

  • Should have 3 to 8 years of experience in implementing Oracle Transport Management and Global Trade Management modules.
  • Experience leading at least 2 end-to-end implementations in Oracle Cloud.
  • Act as an Oracle Cloud domain expert, providing best-practice guidance on supply chain business processes and implementation approaches.
  • Understand business requirements, convert them into system configurations in supply chain modules, and bring in diverse perspectives.
  • Proven exposure to business process areas across the Inventory-to-Deliver cycle.
  • Exposure to SCOR and APQC frameworks is a plus.

Job description:

    We are looking for a skilled and knowledgeable Oracle Integration Technical Consultant to join our team. As an Oracle Integration Technical Consultant, you will be responsible for designing, developing, and implementing Oracle Integration solutions using Oracle Integration Cloud Service (OICS). You will collaborate with clients, project managers, functional consultants, and other technical team members to deliver high-quality integrations to support Oracle applications

Mandatory skills:

  • Design and Development of OIC Integrations in at least one end-to-end implementation project in OIC to Oracle ERP Cloud including extracting Oracle ERP Cloud data using BI Publisher reports.
  • Design and develop integrations in OIC to Oracle ERP Cloud, including data loads using FBDI and extraction of Oracle ERP Cloud data using BI Publisher reports.
  • Hands-On Experience of XSLT transformations.
  • Hands-On Experience in integration methods i.e. SOAP and Rest Web Services, FBDI, and ADF DI.
  • Basic knowledge in the development of packages and functions using SQL/PLSQL and exposing them as REST using ORDS.
  • Good Knowledge of building custom ESS jobs.
  • Hands-on with development and unit testing of integration components and web services (SOAP/REST) using OIC.
  • Working knowledge in integration with Autonomous databases i.e. ATP/ADW will be an advantage.

Responsibilities:

  • Collaborate with clients and functional consultants to gather business requirements and translate them into technical specifications for Oracle Integration solutions.
  • Experience in Designing and developing integrations using both inbound and outbound approaches
  • Perform testing, debugging, and troubleshooting to ensure the quality, performance, and security of Oracle Integration solutions.
  • Collaborate with cross-functional teams to ensure smooth integration with existing systems and data sources
  • Provide technical guidance and expertise to clients and project teams throughout the implementation lifecycle.
  • Stay updated with the latest Oracle Integration features, enhancements, and best practices for rapid application development.

Qualifications:

  • Bachelor’s degree in computer science, Information Systems, or a related field. Relevant certifications in Oracle development and Oracle VBCS are a plus.
  • Proven experience as an Oracle Integration Consultant, with a strong understanding of Oracle Fusion Applications.
  • In-depth knowledge of Oracle Integration and its corresponding development approaches for various types of patterns like File/App/Scheduled/Event based.
  • Proficiency in XSLT, JavaScript technologies.
  • Familiarity with Oracle Fusion Applications and their underlying data models.
  • Strong problem-solving and debugging skills to identify and resolve technical issues.
  • Ability to work independently and collaboratively in a team environment.
  • Excellent communication and interpersonal skills to effectively interact with clients and team members.
  • Experience with full software development lifecycle (SDLC) methodologies, including requirements gathering, analysis, design, development, testing, and deployment.

Skills:

  • Possess strong design and implementation experience in Oracle JET, VBCS, OCE/OCM, Chatbots.
  • Development experience in at least one full life-cycle project using JET, VBCS or OCE products
  • Experience and understanding of web services, XML technologies (WSDL, SOAP, API, REST), OIC/PCS
  • Experience with Web Services and interacting with databases (Oracle, SQL Server, MySQL)
  • Experience with IDCS, SSO, CI/CD, DevOps will be added advantage.

Additional Skills for Senior Developers:

  • Advise and influence customers on process direction/decisions.
  • Ability to apply design patterns and architectural principles.

Soft Skills Required (across experience levels):

  • Strong consulting skills to work in a professional manner with customer IT and business users
  • Provide Technical support for Oracle Cloud integration, data conversion and reports.
  • Participate in automation and digital transformation activities within or outside the client projects

Desired Knowledge :

  • End-to-end implementation experience with Oracle Cloud/ERP applications.
  • Excellent communication skills and the ability to interact with external teams or clients.
  • Experience working with clients and US counterparts to understand their business requirements and provide the right solutions.

Must have skills (candidate should possess strong knowledge in three or more of the areas below):

  • SQL and Pl/SQL
  • Data migration using FBDI.
  • Oracle SaaS BI
  • Cloud integration (ICS/OIC)
  • Oracle VBCS / APEX

Job Description :

  • Basic Python concepts: Solid understanding of the Python programming language, including data types, functions, loops, conditional statements, and file I/O operations.
  • Object-Oriented Programming (OOP): Proficiency in OOP concepts such as classes, objects, inheritance, polymorphism, and encapsulation.
  • Exception Handling: Handling exceptions and errors effectively in Python.
  • Data Structures: Familiarity with Python data structures like lists, dictionaries, tuples, and sets
  • Control Structures: Mastery of loops, conditionals, and functions for effective data processing.
  • String Manipulation: Utilizing Python's string methods for cleaning and transforming text.
  • List Comprehensions: Using concise methods to manipulate and transform lists.
  • Set Operations: Leveraging sets to remove duplicates or perform set-related operations efficiently.
  • Pandas: Employing Pandas for data manipulation, cleaning, and transformation.
  • NumPy: Utilizing NumPy for numerical operations and array manipulations.
  • Other Libraries: Depending on the task, familiarity with libraries related to data processing and manipulation can be advantageous (a short Python/Pandas example follows this list).
  • Squish IDE: Understanding how to navigate and utilize the Squish IDE for recording, creating, editing, and executing test scripts.
  • Squish Test Scripting: Writing test scripts in Python for GUI testing with Squish, including interactions with UI elements and verification of expected behaviour.
  • Object Map: Proficiency in creating and managing object maps, which define how Squish recognizes and interacts with the application's GUI elements.
  • Scripting for Different Platforms: Ability to write test scripts for various platforms, including desktop, web, mobile, and embedded systems.
  • Debugging and Troubleshooting: Proficiency in debugging and troubleshooting issues in automated tests.
  • Knowledge of integrating Squish tests into Continuous Integration/Continuous Deployment pipelines.
  • Skills in generating test reports and analysing test results.
  • Familiarity with version control systems (e.g., Git) for managing test scripts and test suites.
  • Adaptability and Learning: Openness to learning new features and updates within the Squish framework.
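
Purely as an illustration of the basic Python and Pandas concepts listed above (list comprehensions, string manipulation, set operations, and simple DataFrame cleaning), here is a short sketch; the sample records are invented.

    # Short illustration of the basic Python and Pandas concepts listed above.
    # The sample records are invented for demonstration.
    import pandas as pd

    raw_names = ["  Alice ", "BOB", "carol", "BOB"]

    # List comprehension + string methods: strip whitespace and normalize case.
    clean_names = [name.strip().title() for name in raw_names]

    # Set operations: de-duplicate while keeping a sorted, readable result.
    print("unique:", sorted(set(clean_names)))

    # Pandas: load the cleaned values into a DataFrame and transform a column.
    df = pd.DataFrame({"name": clean_names, "score": [85, 92, None, 92]})
    df["score"] = df["score"].fillna(df["score"].mean())  # simple imputation
    df = df.drop_duplicates()
    print(df)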

Primary skills :

  • Hands-on experience in big data systems such as HDFS, Presto, OpenShift, etc.
  • Telco background with protocol knowledge - SS7, MAP, DIA, VoLTE and 5G

Soft skills:

  • Analytical, Systems, Logical thinking, Good written and spoken communication skills.
  • Stakeholder Management, Problem Solver, Self-starter, Go-getter attitude

Technical skills :

  • The data quality manager will be responsible for ensuring the quality of data by establishing data standards and policies, and ensuring adherence to these standards with a focus on improvement of data quality and trust
  • Work collaboratively with cross-functional teams, including engineering, operations, and business groups, to ensure that all quality initiatives are aligned with the organisation's overall business objectives
  • Responsible for defining and implementing data quality checks and data reconciliation rules in partnership with product owners, technical data architects, and data engineers to ensure data integrity (an illustrative sketch of such checks follows this list)
  • Ensure that all quality-related documentation is complete and accurate, including specifications, procedures, and process flows
  • Partner with data owners and data stewards to develop workflows, dashboards, and automation of data governance activities within our data governance tools
  • Perform functional, regression, usability tests, etc. as applicable for both new and existing datasets, systems, reports, and dashboards
  • Deep understanding of data quality management, data governance & standards, and associated technologies
  • Proven experience in establishing and operationalizing data quality management capabilities in a complex, global organization.
  • Experience with Informatica Data Quality, knowledge of SAP MDG as a Data Quality tool
  • Ability to write SQL queries to conduct data validations and probe quality issues with data
  • Ability to develop technical expertise and understanding of system interdependencies and their impacts on Test Data
  • Strong experience in Project Delivery Methodologies such as SDLC, Iterative, Agile, Scrum, Kanban
  • Ability to abstract technical details and effectively communicate to audiences at different levels
  • Excellent listening, communication, and presentation skills. Strong organizational and documentation skills
  • Ability to develop and maintain relationships with internal partners. Proficient at balancing multiple efforts at a time.
  • Ability to navigate through ambiguity to clarify objectives and execution plans
  • Exemplify flexible thinking, curiosity, self-motivation, dedication to quality, focus on the details, and a positive perspective.
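
The snippet below is a small, hedged illustration of the kind of data quality rules described above (completeness, uniqueness, and range checks), written with pandas; the dataset and rules are invented for demonstration and are not tied to any specific tool named in the role.

    # Invented example of simple data quality rules: completeness, uniqueness,
    # and a range check. Not tied to any specific tool named in the role.
    import pandas as pd

    orders = pd.DataFrame(
        {
            "order_id": [1, 2, 2, 4],
            "amount": [100.0, None, 250.0, -5.0],
        }
    )

    checks = {
        "order_id is unique": orders["order_id"].is_unique,
        "amount has no nulls": orders["amount"].notna().all(),
        "amount is non-negative": bool((orders["amount"].dropna() >= 0).all()),
    }

    for rule, passed in checks.items():
        print(f"{rule}: {'PASS' if passed else 'FAIL'}")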

Duties and Responsibilities:

    An Incident Coordinator must be able to handle various tasks in order to successfully meet all of the demands of this job. The following are core Incident Coordinator duties and responsibilities.

Create Incident Report:

    Keeping a log of incidents is an important task of the Incident Coordinator. This not only helps them to keep track of issues and ensure resolution, but also assists them in examining incidents and establishing processes to help prevent or minimize similar problems from arising.

Implement Effective Procedures :

    An Incident Coordinator will develop procedures and policies by which technical support teams will operate. These processes will be applied in areas such as service failures and cyber security threats. They will also train support engineers.

Incident Coordinator Skills:

    In order to successfully complete all tasks, an Incident Coordinator needs to possess strong problem solving, analytical and time management skills. They should also be able to apply organizational, critical thinking and oral and written communication skills. An Incident Coordinator will be an effective team player and leader who can work independently when necessary. Paying attention to detail and handling crisis situations are also important traits for Incident Coordinators.

Core skills :

  • Coordination on bridge calls across teams, groups, sections, departments.
  • Maintaining incident logs and processing incident reports for review with upper management
  • Solving complex problems with information technology software and hardware
  • Providing training for technical support teams
  • Developing procedural manuals for various IT issues

Advanced skills :

  • Ability to handle and perform in stressful situations
  • Proficiency in Microsoft applications such as Word, PowerPoint and Excel

Flow :

  • Incident Logging (Phone calls, Emails, Live Chats etc)
  • Incident Coordination (Taking Notes on Bridge calls, RCA coordination)
  • Ticket Creation (Incident, Service Requests)
  • Incident Categorization / Prioritization (Critical, High, Medium, Low)
  • Incident Resolution
  • Incident Closure

Other Skills :

  • Hands on experience on Linux variants
  • Exposure to public cloud technologies and virtualization
  • Understanding of Jenkins CI/CD pipelines
  • Familiarity with monitoring tools like Grafana, Prometheus, Kibana etc
  • Basic troubleshooting skills on network (TCP/IP, DNS, DHCP, Routing, Switching, Traceroute, TCPDump, Wireshark etc)
  • Capable of self-learning on the job
  • Should be ready to work in rotational shifts, including night shifts

Skills :

    J2EE, SCJP, Java, Spring Framework, Hibernate, JPA, Git, CI/CD (Jenkins), REST microservices, Spring Boot, data structures and algorithms, OOP, multithreading, and Node or Angular

Responsibilities :

  • Responsible for building and maintaining stable products with latest tech-stack, frameworks and following Agile and DevOps ways-of-working
  • Accountable for delivering quality solutions to the problems without compromising on the planned deadlines. Comfortable leading a team of developers
  • Opportunity to transform by redesigning and enhancing applications to latest technologies
  • Maximize the opportunity to excel in an open and recognizing work culture. Be a problem solver, leader, and a team player to make a bigger contribution to the achievements.
  • Play a part in every aspect of the software development lifecycle, including software design, development, testing, reviewing, and deployment
  • Open to learning from each other in the team and from each day-to-day experience

QUALIFICATIONS :

  • BE/BTech/MCA/MCS from a reputed institute
  • Have 2+ years of experience with two or more development languages. Experience working with Python is a MUST.
  • Experience in writing relatively complex DB queries (any relational DB) is a MUST.
  • Experience in building RESTful APIs using Python and web frameworks such as Flask is a MUST (a brief sketch follows this list)
  • Experience working with ORM tools such as SQLAlchemy or the Django ORM is a MUST
  • Working knowledge on cloud services of AWS (or Azure) is highly desirable
  • Experience in building applications with containerization like Docker is nice to have
  • Skillful in writing high-quality, well-tested code
  • Comfortable with Agile methodologies, such as Scrum, Kanban
  • Key competencies required: Problem-Solving, Analytical, Collaboration, and Accountability
  • An influencer by always advocating for technical excellence and innovation while being open to change when needed
  • Resilient in ambiguous situations and can approach challenges from multiple perspectives
  • Efficiently utilize DevOps tools and practices to build and deploy software
  • AWS® certifications would be an advantage
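
As a minimal, hedged sketch of the Flask-plus-ORM stack named above, the snippet below exposes a single RESTful endpoint backed by Flask-SQLAlchemy; the model, route, and SQLite URI are placeholders, and the legacy Model.query style is used for brevity.

    # Minimal Flask + Flask-SQLAlchemy sketch of a single RESTful endpoint.
    # Model, route, and the SQLite URI are placeholders.
    from flask import Flask, jsonify
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///demo.db"  # placeholder DB
    db = SQLAlchemy(app)


    class Item(db.Model):
        id = db.Column(db.Integer, primary_key=True)
        name = db.Column(db.String(80), nullable=False)


    @app.route("/items", methods=["GET"])
    def list_items():
        # Runs inside a request context, so the legacy query API is available.
        items = Item.query.all()
        return jsonify([{"id": i.id, "name": i.name} for i in items])


    if __name__ == "__main__":
        with app.app_context():
            db.create_all()
        app.run(debug=True)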

Job Description :

  • Hands-on design, build, and deploy
  • Extensive experience in automation using SAP
  • Database, file, and Excel integration
  • Work-queue
  • Experience in solutioning

Skills :

  • Big data, HDFS, Hadoop, Kubernetes
  • Hands-on experience in big data systems such as HDFS, Presto, OpenShift, etc.
  • Telco background with protocol knowledge - SS7, MAP, DIA, VoLTE and 5G.

Skills :

  • UiPath
  • Power Apps
  • Power Automate
  • Ability to pick up new technologies

Skills :

  • Experience in .Net based full stack development
  • Power pages
  • API Creation
  • Ability to pick up UiPath or similar technologies

Skills :

  • Bachelor's degree in Computer Science, or with equivalent professional experience.
  • Exceptional problem-solving skills
  • Self-motivated
  • Minimum of 3 years of hands-on BI experience on SQL, SSAS tabular Models
  • 5+ years of experience in PowerBI Reports, Data Modelling, M Query, DAX.
  • Capable of implementing row-level security on data along with an understanding of application security layer models in Power BI.
  • Expert in using advanced-level calculations on the data set.
  • Expert-level knowledge of using SSIS.
  • Knowledge of or experience in Python development, especially PySpark in an AWS Cloud environment
  • AWS Solution Architect certification is desirable
  • Good to have: Power Automate/Power Apps knowledge, Tableau.
  • Responsible for design methodology and project documentation
  • Should be able to develop tabular and multidimensional models that are compatible with data warehouse standards.
  • Experience in an Agile environment
  • Experience with Git and VSTS
  • Excellent written and verbal communication skills

Skills :

  • Deliver automation/business transformation/process improvement projects individually or with limited guidance and (preferably) lead small workstreams, etc.
  • Conduct automation/improvement/opportunity assessments, be able to understand process, sub-processes, activities across functions and sectors, document the same, prepare process maps using tools such as Visio etc., understand and relate to various KPIs and process SLAs
  • Develop financial models, business cases and prioritize improvement opportunities using relevant frameworks and methodologies
  • Work within project constraints, ensure effective communication and be able to identify and communicate projects risks and issues proactively
  • Contribute ideas for problem solving, help build better solutions by leveraging different project and functional expertise, support firm initiatives as required
  • Prepare and maintain project documentation, proper archiving, and carrying out firm defined quality and compliance processes
  • Good Sector/Process/Functional expertise or exposure to GCC ecosystem/landscape is desirable
  • Experience and exposure to IT project delivery preferably in RPA automation projects involving AA, UiPath or Blue Prism technologies
  • Advanced knowledge of MS Office tools especially MS Excel, PowerPoint, Word & MS Project
  • Knowledge of VBA Macros; exposure to AI/ML technologies is desirable
  • Good communication (oral and written) skills

Skills :

  • AI/ML work experience (Machine Learning, Computer Vision, NLP and LLMs)
  • Azure Cloud (Azure VM, Cognitive services, Azure functions, Logic apps etc)

Skills :

  • Strong grasp of C++.
  • Socket programming and network protocols are essential.
  • Server architecture principles, including multi-threading and scalability / Load testing expertise.
  • Familiarity with Linux and basic system administration.
  • Understanding security best practices.
  • Knowledge of continuous integration and continuous deployment practices.

Skills :

  • HTML, CSS, JavaScript: understanding of web development.
  • Java, Python, Ruby, or Node.js: proficiency in server-side languages to develop the backend logic and APIs
  • SQL (e.g., MySQL, PostgreSQL): Managing and interacting with databases
  • TCP/IP, HTTP/HTTPS: In-depth knowledge of networking protocols
  • Android Enterprise (for Android): Familiarity with Android Enterprise APIs for managing Android devices
  • Apple-specific MDM protocol.
  • OAuth, OpenID Connect: Implementing secure authentication and authorization mechanisms
  • Docker, Kubernetes: Knowledge of containerization and orchestration for deploying and managing applications
  • GDPR, HIPAA, etc.: Understanding and implementing compliance standards relevant to the industry and region
  • RESTful APIs: Designing and implementing APIs for communication between the server and mobile devices.
  • AWS, Azure, or Google Cloud Platform: Experience with cloud services for scalable and reliable hosting
  • Monitoring Tools (e.g., Prometheus): Implementing monitoring solutions for tracking the health and performance of servers
  • Logging Tools (e.g., ELK Stack): Capturing and analyzing logs for troubleshooting and auditing.

Skills :

  • Proficiency in SQL and PL/pgSQL.
  • Strong understanding of PostgreSQL internals, architecture, and advanced features.
  • Experience with backup and recovery tools (e.g., pg_dump, pg_basebackup).
  • Familiarity with monitoring tools (e.g., pg_stat_statements, Prometheus, Grafana); a short pg_stat_statements sketch follows this list.
  • Knowledge of database security practices and tools.
  • Experience with replication and clustering solutions.
  • Familiarity with Linux/Unix operating systems and shell scripting.
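
Below is a small, assumption-laden sketch of the kind of monitoring query mentioned above: pulling the most expensive statements from pg_stat_statements via psycopg2. Connection details are placeholders, the extension must be installed and enabled, and the timing column is total_exec_time on PostgreSQL 13+ (older versions expose total_time).

    # Sketch: most expensive statements from pg_stat_statements via psycopg2.
    # Connection details are placeholders; the extension must be enabled.
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="appdb", user="monitor", password="secret")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT query, calls, total_exec_time
            FROM pg_stat_statements
            ORDER BY total_exec_time DESC
            LIMIT 5
            """
        )
        for query, calls, total_ms in cur.fetchall():
            print(f"{total_ms:10.1f} ms  {calls:6d} calls  {query[:60]}")
    conn.close()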

Good to have :

  • PostgreSQL certification (e.g., PostgreSQL CE).
  • Experience with cloud-based PostgreSQL solutions (e.g., Amazon RDS, Google Cloud SQL).
  • Familiarity with other database technologies (e.g., MySQL, MongoDB) and data warehousing concepts.
  • Experience with DevOps practices and tools (e.g., Docker, Kubernetes, Ansible).

Skills :

  • Min 6+ years of professional experience in Core Java programming
  • Collaborate with developers to understand project requirements and deliver high-quality software solutions.
  • Develop and maintain backend services using Core Java, ensuring scalability, reliability, and performance.
  • MUST have knowledge of 5G SBI architecture, OpenAPI, 3GPP specifications and call flows, and 5G Network Functions.
  • Must have extensive experience in the 5G domain.
  • Strong proficiency in Core Java (Java SE) and object-oriented programming concepts.
  • Utilize version control systems like Git for managing code repositories, branching, and collaboration.
  • Comfortable working with containerization technologies such as Docker/k8s environment to package, deploy, and manage applications in a consistent and efficient manner.
  • Write clean, maintainable code and conduct thorough code reviews to ensure code quality and adherence to best practices.

Skills :

  • Min 3+ years of professional experience in Core Java programming.
  • Develop and maintain backend services using Core Java, ensuring scalability, reliability, and performance.
  • Knowledge of 5G SBI architecture or any SMS protocols (SS7, SMPP, SIP).
  • Proficiency in Core Java (Java SE) and object-oriented programming concepts.
  • Utilize version control systems like Git for managing code repositories, branching, and collaboration.
  • Exposure to containerization technologies such as Docker/K8s for packaging and deploying applications is an advantage.
  • Write clean, maintainable code and conduct thorough code reviews to ensure code quality and adherence to best practices.

Skills :

  • Experience with Cypress and Selenium tools is mandatory for this role.
  • Programming knowledge of Java and JavaScript is a must
  • UI and API automation knowledge

Job Description :

  • Daily processing and application of customer payments
  • Researching issues related to unidentified payments
  • Processing of customer refunds related to duplicate or erroneous payments.
  • Handling of credit card chargebacks and returned check deposits.
  • Processing of adjustments related to cash posting discrepancies

Skills & Qualification :

  • Graduate/Post Graduate with Accounting as a Major.
  • 6-8 years of experience in a Finance/Accounting environment in shared service industry.
  • Grasps accounting, auditing, and internal control concepts and can apply the same to CSC’s business environment
  • General knowledge of various banking transactions (ACH, Wire Transfer, Credit Card, etc.)
  • Knowledge of Microsoft Office applications and ERP software (preferably SAP/Oracle)
  • Knowledge of managing cash application with multiple foreign currencies is a plus.
  • Excellent analytical and problem-solving skills, with excellent attention to detail
  • Enjoys working in a team environment.
  • Organized and able to handle multiple priorities in a continuous improvement environment.
  • Self-motivated and adapts to change
  • Exposure to web banking software
  • Provides great service to internal and external customers by making a determined effort to resolve problems and keep them from happening

Skill Set :

  • Strong project management.
  • Strong Team management.
  • Should be a hands-on candidate.
  • PySpark, Databricks, AWS, Terraform.
