Hi-Tech Medicare Devices Pvt. Ltd., a leading manufacturer and supplier of disposable medical devices, is looking for suitable candidates to expand its manufacturing and sales teams for the position of Plant Head / Unit Head.
Location: Gorakhpur.
Qualification: Degree/Diploma in Engineering (Mech./ Plastic Tech.) or Science Graduate.
Experience: Minimum 10 years in the manufacturing of I.V. Cannula or similar-category medical devices.
About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.
Data Axle India is recognized as a Great Place to Work! This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.
General Summary:
As a Digital Data Management Architect, you will design, implement, and optimize advanced data management systems that support processing billions of digital transactions, ensuring high availability and accuracy. You will leverage your expertise in developing identity graphs, real-time data processing, and API integration to drive insights and enhance user experiences across digital platforms. Your role is crucial in building scalable and secure data architectures that support real-time analytics, identity resolution, and seamless data flows across multiple systems and applications.
Roles and Responsibilities:
- Data Architecture & System Design:
- Design and implement scalable data architectures capable of processing billions of digital transactions in real-time, ensuring low latency and high availability.
- Architect data models, workflows, and storage solutions to enable seamless real-time data processing, including stream processing and event-driven architectures.
- Identity Graph Development:
- Lead the development and maintenance of a comprehensive identity graph to unify disparate data sources, enabling accurate identity resolution across channels.
- Develop algorithms and data matching techniques to enhance identity linking, while maintaining data accuracy and privacy.
- Real-Time Data Processing & Analytics:
- Implement real-time data ingestion, processing, and analytics pipelines to support immediate data availability and actionable insights.
- Work closely with engineering teams to integrate and optimize real-time data processing frameworks such as Apache Kafka, Apache Flink, or Spark Streaming.
- API Development & Integration:
- Design and develop real-time APIs that facilitate data access and integration across internal and external platforms, focusing on security, scalability, and performance.
- Collaborate with product and engineering teams to define API specifications, data contracts, and SLAs to meet business and user requirements.
- Data Governance & Security:
- Establish data governance practices to maintain data quality, privacy, and compliance with regulatory standards across all digital transactions and identity graph data.
- Ensure security protocols and access controls are embedded in all data workflows and API integrations to protect sensitive information.
- Collaboration & Stakeholder Engagement:
- Partner with data engineering, analytics, and product teams to align data architecture with business requirements and strategic goals.
- Provide technical guidance and mentorship to junior architects and data engineers, promoting best practices and continuous learning.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- 10+ years of experience in data architecture, digital data management, or a related field, with a proven track record in managing billion+ transactions.
- Deep experience with identity resolution techniques and building identity graphs.
- Strong proficiency in real-time data processing technologies (e.g., Kafka, Flink, Spark) and API development (RESTful and/or GraphQL).
- In-depth knowledge of database systems (SQL, NoSQL), data warehousing solutions, and cloud-based platforms (AWS, Azure, or GCP).
- Familiarity with data privacy regulations (e.g., GDPR, CCPA) and data governance best practices.
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
Skills: Data architecture, Systems design, Spark, Apache Kafka, Flink, Identity resolution, Identity graph, Data processing, Customer Data Platform (CDP)
Roles & Responsibilities:
We are looking for a Senior Data Scientist to join the Data Science Client Services team and continue our success in identifying high-quality target audiences that generate profitable marketing returns for our clients. We are looking for experienced data science, machine learning, and MLOps practitioners to design, build, and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.
We are looking for a Senior Data Scientist who will be responsible for:
- Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
- Design or enhance ML workflows for data ingestion, model design, model inference and scoring
- Oversight on team project execution and delivery
- Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
- Visualize and publish model performance results and insights to internal and external audiences
Qualifications:
- Master’s degree in a relevant quantitative or applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
- Minimum of 5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
- Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
- Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
- Proficiency in Python and SQL required; PySpark/Spark experience a plus
- Ability to conduct productive peer reviews and maintain proper code structure in GitHub
- Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
- Working knowledge of modern CI/CD methods
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
Skills: Machine Learning (ML), Databricks, Python, SQL, PySpark and XGBoost
General Summary:
We are looking for a Full-stack Developer who will be responsible for:
Roles & Responsibilities:
- Implement application components and systems according to department standards and guidelines.
- Work with product and designers to translate requirements into accurate representations for the web.
- Analyze, design, code, debug, and test business applications.
- Code reviews in accordance with team processes/standards.
- Perform other miscellaneous duties as assigned by Management.
Qualifications:
- 3+ years of Software Engineering experience required.
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Experience in developing web applications using Django.
- Strong knowledge of ReactJS and related libraries such as Redux.
- Proficient in HTML, CSS, and JavaScript.
- Experience in working with SQL databases such as MySQL.
- Strong problem-solving skills and attention to detail.
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
Skills: Python, Django and React.js
General Summary:
The Senior Software Engineer will be responsible for designing, developing, testing, and maintaining full-stack solutions. This role involves hands-on coding (80% of time), performing peer code reviews, handling pull requests and engaging in architectural discussions with stakeholders. You'll contribute to the development of large-scale, data-driven SaaS solutions using best practices like TDD, DRY, KISS, YAGNI, and SOLID principles. The ideal candidate is an experienced full-stack developer who thrives in a fast-paced, Agile environment.
Essential Job Functions:
- Design, develop, and maintain scalable applications using Python and Django.
- Build responsive and dynamic user interfaces using React and TypeScript.
- Implement and integrate GraphQL APIs for efficient data querying and real-time updates.
- Apply design patterns such as Factory, Singleton, Observer, Strategy, and Repository to ensure maintainable and scalable code.
- Develop and manage RESTful APIs for seamless integration with third-party services.
- Design, optimize, and maintain SQL databases like PostgreSQL, MySQL, and MSSQL.
- Use version control systems (primarily Git) and follow collaborative workflows.
- Work within Agile methodologies such as Scrum or Kanban, participating in daily stand-ups, sprint planning, and retrospectives.
- Write and maintain unit tests, integration tests, and end-to-end tests, following Test-Driven Development (TDD).
- Collaborate with cross-functional teams, including Product Managers, DevOps, and UI/UX Designers, to deliver high-quality products.
Essential functions are the basic job duties that an employee must be able to perform, with or without reasonable accommodation. The function is considered essential if the reason the position exists is to perform that function.
Supportive Job Functions:
- Remain knowledgeable of new emerging technologies and their impact on internal systems.
- Available to work on call when needed.
- Perform other miscellaneous duties as assigned by management.
These tasks do not meet the Americans with Disabilities Act definition of essential job functions and usually equal 5% or less of time spent. However, these tasks still constitute important performance aspects of the job.
Skills
- The ideal candidate must have strong proficiency in Python and Django, with a solid understanding of Object-Oriented Programming (OOP) principles.
- Expertise in JavaScript, TypeScript, and React is essential, along with hands-on experience in GraphQL for efficient data querying.
- The candidate should be well-versed in applying design patterns such as Factory, Singleton, Observer, Strategy, and Repository to ensure scalable and maintainable code architecture.
- Proficiency in building and integrating REST APIs is required, as well as experience working with SQL databases like PostgreSQL, MySQL, and MSSQL.
- Familiarity with version control systems (especially Git) and working within Agile methodologies like Scrum or Kanban is a must.
- The candidate should also have a strong grasp of Test-Driven Development (TDD) principles.
- In addition to the above, it is good to have experience with Next.js for server-side rendering and static site generation, as well as knowledge of cloud infrastructure such as AWS or GCP.
- Familiarity with NoSQL databases, CI/CD pipelines using tools like GitHub Actions or Jenkins, and containerization technologies like Docker and Kubernetes is highly desirable.
- Experience with microservices architecture and event-driven systems (using tools like Kafka or RabbitMQ) is a plus, along with knowledge of caching technologies such as Redis or Memcached.
- Understanding OAuth2.0, JWT, SSO authentication mechanisms, and adhering to API security best practices following OWASP guidelines is beneficial.
- Additionally, experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation, and familiarity with performance monitoring tools such as New Relic or Datadog will be considered an advantage.
Abilities:
- Ability to organize, prioritize, and handle multiple assignments on a daily basis.
- Strong and effective interpersonal and communication skills.
- Ability to interact professionally with a diverse group of clients and staff.
- Must be able to work flexible hours on-site and remote.
- Must be able to coordinate with other staff and provide technological leadership.
- Ability to work in a complex, dynamic team environment with minimal supervision.
- Must possess good organizational skills.
Education, Experience, and Certification:
- Associate or bachelor’s degree preferred (Computer Science, Engineering, etc.), but equivalent work experience in a technology-related area may substitute.
- 2+ years of relevant experience required.
- Experience using version control daily in a developer environment.
- Experience with Python, JavaScript, and React is required.
- Experience using rapid development frameworks like Django or Flask.
- Experience using front end build tools.
Scope of Job:
- No direct reports.
- No supervisory responsibility.
- Consistent work week with minimal travel.
- Errors may be serious, costly, and difficult to discover.
- Contact with others inside and outside the company is regular and frequent.
- Some access to confidential data.
Electric Power Transmission, Control, and Distribution; Services for Renewable Energy; Information Services
Key Deliverables:
- Lead the development of all aspects of research in Electrical Power Systems Engineering, specifically in the space of software-defined power, i.e. managing stranded power in a DC.
- Collaborate with Design and Engineering (D&E) core teams to develop, implement, and manage the DC electrical designs for new development, infrastructure upgrades, and rectifications. A specific focus is incorporating the latest technologies to reduce costs and deliver specific cost savings.
- Participate in new technology insertions through collaboration with the core engineering team to provide site-specific requirements during the development of the BOD (including site-specific requirements).
- Conduct quarterly reviews of technologies and commercials for key aspects.
- Modelling, simulation, and data management; experience with simulation and modelling software (e.g., MATLAB, COMSOL).
Qualifications:
- Master’s or PhD degree in Electrical Engineering or equivalent from a recognized university; a major in Power Engineering will be advantageous.
- Experience in design, construction, and commissioning of medium/low/high-voltage electrical distribution systems, AC/DC systems, and associated power management/SCADA tools.
- Prior knowledge of power system condition monitoring.
Interested candidates can share their updated CV at jayashree.maheshwaran@bdxworld.com