Technology

DeepL to Debut NVIDIA DGX SuperPOD with DGX GB200 Systems in Europe

The purchase, the biggest yet for DeepL, is among the first deployments of the NVIDIA DGX SuperPOD with DGX GB200 systems and will fuel DeepL's industry-leading Language AI platform

COLOGNE, Germany, Oct. 31, 2024 /PRNewswire/ — DeepL, a leading global Language AI company, today announced it will be among the first to commercially deploy the NVIDIA DGX SuperPOD with DGX GB200 systems. The NVIDIA DGX SuperPOD, which is expected to be operational at DeepL by mid-2025, will be used to power research computation. It will provide DeepL the additional computing power needed to train new models and develop features and products to take its innovative Language AI platform—which is breaking down language barriers for businesses and professionals globally—to the next level.

“DeepL has always been a research-led company, which has enabled us to develop Language AI for translation that continues to outperform other solutions on the market,” said Jarek Kutylowski, CEO and Founder of DeepL. “This latest investment in NVIDIA accelerated computing will give our research and engineering teams the power necessary to continue innovating and bringing to market the Language AI tools and features that our customers know and love us for.”

With scalability for up to tens of thousands of GPUs, the liquid-cooled, rack-scale design of NVIDIA DGX GB200 systems includes NVIDIA GB200 Grace Blackwell Superchips, letting DeepL run the high-performance AI models necessary for its advanced generative AI applications. This next generation of clusters is purpose-built to deliver extreme performance and consistent uptime for superscale generative AI training and inference workloads.

This marks DeepL's third deployment of an NVIDIA DGX SuperPOD and offers more processing power than DeepL Mercury, a Top500 supercomputer and DeepL's previous flagship NVIDIA DGX SuperPOD with DGX H100 systems, deployed a year ago in Sweden. The latest deployment will be in the same Swedish data centre.

“Customers using Language AI applications expect nearly instant responses, making efficient and powerful AI infrastructure critical for both building and deploying AI in production,” said Charlie Boyle, vice president of the NVIDIA DGX platform at NVIDIA. “DeepL’s deployment of the latest NVIDIA DGX SuperPOD will accelerate its Language AI research and development, empowering users to communicate more effectively across languages and cultures.”

With a rapidly growing customer network of over 100,000 businesses and governments around the world, including 50% of the Fortune 500 and industry leaders like Zendesk, Nikkei, Coursera, and Deutsche Bahn, DeepL is revolutionising global communication with its groundbreaking Language AI platform. The company's industry-leading translation and writing tools empower businesses to break down language barriers, expand into new markets, and drive unprecedented cross-border collaboration.

This announcement is the latest in a series of big developments for DeepL in 2024. The company recently opened a new New York tech hub, rolled out updates to its Glossary feature, and unveiled its next-generation large language model (LLM), which outperforms GPT-4, Google, and Microsoft in translation quality, setting a new standard for personalization, accuracy, and performance. DeepL was also recently named to Forbes' 2024 Cloud 100 list, and in May raised $300M of new investment at a $2B valuation, led by renowned late-stage investment firm Index Ventures.

About DeepL 
DeepL is on a mission to break down language barriers for businesses everywhere. Over 100,000 businesses and governments and millions of individuals in 228 global markets trust DeepL's Language AI platform for human-like translation and better writing. Companies around the world leverage DeepL's AI solutions, designed with enterprise security in mind and specifically tuned for language, to transform business communications, expand markets, and improve productivity. Founded in 2017 by CEO Jaroslaw (Jarek) Kutylowski, DeepL today has over 1000 passionate employees and is supported by world-renowned investors including Benchmark, IVP, and Index Ventures.

Logo – https://mma.prnewswire.com/media/2447716/DeepL_Logo.jpg

View original content:https://www.prnewswire.co.uk/news-releases/deepl-to-debut-nvidia-dgx-superpod-with-dgx-gb200-systems-in-europe-302292517.html


Sifflet Achieves Google Cloud Ready – BigQuery Designation

Getting the clarity needed to scale a complex data stack with Sifflet x BigQuery

PARIS, Oct. 31, 2024 /PRNewswire/ — Sifflet has successfully achieved the Google Cloud Ready – BigQuery Designation. This milestone marks a significant step in its mission to give every data producer and user clarity into the health and quality of data, so everyone can make better business decisions.

Sifflet benefits include:

- Proactive Data Quality Oversight: Sifflet's integration with BigQuery establishes an additional layer of observability and allows companies to maintain the quality of their data, detecting hidden issues and addressing them proactively before they impact business operations.
- Enhanced Data Accessibility: With Sifflet, data teams can easily find and utilize their BigQuery data, reducing barriers between data producers and data consumers. This enables seamless collaboration and data-driven decision-making across the organization.
- In-depth Insights into the Data Lifecycle: Connecting a BigQuery environment to Sifflet provides additional valuable insights into the creation, transformation, and consumption of data. This holistic view enhances the ability to manage and optimize data initiatives.

This integration has been enabled through a few key features that make the data observability experience more effective:

- BigQuery Metadata Enrichment: Sifflet further enhances BigQuery metadata with automated tagging, descriptions, field-level lineage computation with upstream and downstream systems, and actionable data asset monitoring.
- External Table Support: Sifflet can monitor external BigQuery tables backed by Bigtable or Cloud Storage data. This enables end-to-end lineage in Sifflet no matter where the data is located.
- Optimized Monitoring Queries: Sifflet leverages multiple BigQuery capabilities, such as partitioning and metadata APIs, to run highly optimized queries when monitoring BigQuery assets, even when querying complex types like nested and repeated fields.
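To make the "optimized monitoring queries" point concrete, here is a minimal sketch of how a monitoring tool can exploit BigQuery partitioning: restricting a freshness check to recent partitions means BigQuery prunes everything outside the lookback window instead of scanning the whole table. The table name, column default, and function are hypothetical illustrations, not Sifflet's actual implementation.

```python
# Sketch: build a partition-pruned freshness query for a BigQuery table.
# Restricting the WHERE clause to the partition column lets BigQuery scan
# only the partitions in the lookback window (partition pruning).

def freshness_check_sql(table: str,
                        partition_col: str = "_PARTITIONDATE",
                        lookback_days: int = 2) -> str:
    """Return SQL counting rows per recent partition of `table`."""
    return (
        f"SELECT {partition_col} AS day, COUNT(*) AS row_count\n"
        f"FROM `{table}`\n"
        f"WHERE {partition_col} >= "
        f"DATE_SUB(CURRENT_DATE(), INTERVAL {lookback_days} DAY)\n"
        f"GROUP BY day\n"
        f"ORDER BY day"
    )

sql = freshness_check_sql("my_project.analytics.events")
print(sql)
```

A real integration would submit this through the BigQuery client libraries and compare the per-partition counts against expected volumes to flag stale or missing loads.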

Achieving the Google Cloud Ready – BigQuery Designation means that Sifflet has met essential functional and interoperability requirements for integrating with BigQuery. This designation not only validates Sifflet’s integration but also ensures that Sifflet’s products work seamlessly with BigQuery.

Throughout this program, Sifflet has collaborated closely with Google Cloud's partner engineering and BigQuery teams, refining the integration and developing joint roadmaps for future advancements such as expanding partitioned table support and adding support for shared tables, ML models, and OAuth connectivity.

“The integration of Sifflet with Google Cloud’s BigQuery has transformed our approach to data observability at Carrefour Links. Thanks to Sifflet’s proactive oversight, we can identify and resolve potential issues before they impact our operations,” said Medhi Labassi, CTO at Carrefour Links. “Additionally, the simplified access to data allows our teams to collaborate more effectively, fully harnessing the insights provided by BigQuery.”

This collaboration opens up new avenues for innovation and growth, as Sifflet is committed to further enhancing its capabilities and delivering even more value to customers through continued collaboration with Google Cloud. Sifflet’s goal remains steadfast: to support organizations everywhere with data they can rely upon.

To learn more about Sifflet’s expertise with Google Cloud and how Sifflet can benefit your organization, visit our website. For more information about the Google Cloud Ready – BigQuery program and its benefits, click here.

Sifflet gives organizations the clarity they need to scale a complex data stack with confidence. With a focus on innovation and collaboration, Sifflet leverages the power of BigQuery to provide unparalleled data observability and performance. With a rapidly growing customer base that includes leaders like Carrefour, BBC Studios, Servier, Penguin Random House and Etam, Sifflet has proven its value by providing essential tools for data cataloging, lineage tracking, and quality monitoring. The company’s recent momentum, paired with industry recognition for best ROI and fastest implementation, highlights its strong trajectory and commitment to simplifying data operations for companies worldwide.

Photo: https://mma.prnewswire.com/media/2545858/Sifflet_SAS_Infographic.jpg 
Logo:  https://mma.prnewswire.com/media/2545859/Sifflet_SAS_Logo.jpg

CONTACT: Romain Doutriaux, romain@siffletdata.com 

View original content:https://www.prnewswire.com/news-releases/sifflet-achieves-google-cloud-ready—bigquery-designation-302292886.html

SOURCE Sifflet SAS


Hunter Strategy Secures $631K Contract with NOAA for Data Flow Risk Assessment


WASHINGTON, Oct. 31, 2024 /PRNewswire/ — Hunter Strategy, LLC, a HUBZone-certified small business specializing in technology and cybersecurity solutions, has been awarded a $631,054.45 contract by the National Oceanic and Atmospheric Administration (NOAA). The 36-month contract focuses on providing Data Flow Supply Chain Risk Assessment and Target Modeling services for NOAA’s National Environmental Satellite, Data, and Information Service (NESDIS).

The firm-fixed-price contract (Award ID: 1332KP24P0007) was competitively awarded under a small business set-aside, with Hunter Strategy selected from among two bidders.

Kevin Belanga, Chief Growth Officer of Hunter Strategy, stated: “This contract with NOAA represents a significant opportunity for Hunter Strategy to apply our expertise in supply chain risk assessment and data modeling to support critical environmental monitoring systems. We’re excited to contribute to NOAA’s mission, enhancing the security and efficiency of their data management processes. This partnership aligns perfectly with our commitment to delivering innovative solutions that address complex challenges in the public sector.”

The project involves various assessments of NESDIS’s data supply chain, identifying potential risks and developing targeted threat modeling strategies for NESDIS systems. Hunter Strategy’s work will play a crucial role in enhancing the security and efficiency of NOAA NESDIS systems. This award further establishes Hunter Strategy as a trusted partner for government agencies seeking advanced technology and cybersecurity solutions.

For more information about Hunter Strategy and its services, please visit www.hunterstrategy.net.

About Hunter Strategy:
Hunter Strategy is a HUBZone-certified small business delivering innovative technology and cybersecurity solutions to government and commercial clients. Focusing on security services, cloud solutions, and technology research, the company has built a reputation for technical excellence among federal customers and tailored services in highly regulated industries.

View original content to download multimedia:https://www.prnewswire.com/news-releases/subject-line-hunter-strategy-secures-631k-contract-with-noaa-for-data-flow-risk-assessment-302292961.html

SOURCE Hunter Strategy


Follow-Up Visits Can Significantly Reduce Post-Discharge Mortality and Readmissions for Highest Risk Patients


NASHVILLE, Tenn., Oct. 31, 2024 /PRNewswire/ — Today at NCQA, a leading health quality innovation conference, Houston Methodist and Health Data Analytics Institute (HDAI) jointly discussed the impact of focusing follow-up appointments on patients at high risk.

HDAI’s analysis of both Houston Methodist’s patients and a national dataset of over 10M Medicare patients demonstrated that targeted follow-up appointments within the first 14 days after discharge reduce mortality and readmissions for the top quintile of patients at risk of readmission and 30-day mortality.

The data showed a distinct inverse correlation: as 14-day follow-up rates increase, patients are less likely to die post-discharge or be readmitted. For patients seen later than 14 days post-discharge, or who are at lower predicted risk, there is little to no effect on mortality and readmissions.
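The analysis described above can be illustrated with a small sketch: rank patients by a predicted readmission risk score, take the top quintile, and compare readmission rates between those seen within 14 days of discharge and those who were not. The data here are synthetic and the effect size is assumed for illustration; this is not HDAI's actual methodology or dataset.

```python
# Illustrative sketch of risk-quintile stratification on synthetic data.
import random

random.seed(0)
patients = []
for _ in range(5000):
    risk = random.random()                 # predicted readmission risk score
    followup_14d = random.random() < 0.5   # seen within 14 days of discharge?
    # Assumed synthetic effect: timely follow-up halves readmission odds.
    p_readmit = risk * (0.5 if followup_14d else 1.0)
    patients.append((risk, followup_14d, random.random() < p_readmit))

# Top quintile by predicted risk.
patients.sort(key=lambda p: p[0], reverse=True)
top = patients[: len(patients) // 5]

def rate(group):
    """Fraction of a patient group that was readmitted."""
    return sum(readmit for _, _, readmit in group) / len(group)

seen = [p for p in top if p[1]]
unseen = [p for p in top if not p[1]]
print(f"top-quintile readmission, 14-day follow-up: {rate(seen):.1%}")
print(f"top-quintile readmission, no follow-up:     {rate(unseen):.1%}")
```

Under these assumptions the high-risk follow-up group shows a markedly lower readmission rate, mirroring the inverse correlation the analysis reports; in the lower quintiles, where baseline risk is small, the same comparison shows little separation.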

Stuart Dobbs, M.D., Chief Quality Officer at Houston Methodist Hospital, states, “What we learned is another reminder that what you don’t know, you can’t measure, and what is not measured, usually is not improved. We have some of the lowest inpatient mortality rates in the country. However, our post discharge mortality rates were about average compared to large academic medical centers. We are now focused on scheduling high risk patients with a follow-up visit (within 14 days or less) prior to their hospital discharge.”  

Brenda Campbell, RN, Senior Consultant Houston Methodist Health System Innovations, added, “Risk stratification tools can assist organizations in prioritizing care based on the unique needs of patients while managing limited resources.”

HDAI’s HealthVision intelligent health platform was embedded in Houston Methodist’s Electronic Health Record (EHR) so that clinicians on the floors and in specialty clinics could identify and schedule appointments for the most impactable, high-risk patients before discharge. In addition to using the risk-ranked rosters, the clinicians use the patient chart summary to quickly assess any specific risks and key drivers of health, summarized and updated in real-time, to facilitate personalized care planning.

“By leveraging real-time EHR integration of advanced predictive analytics and generative AI, clinicians on the floors are identifying high-risk patients along with granular underlying drivers of risk to help create targeted follow-up plans,” adds Nassib Chamoun, Founder and CEO of HDAI and a co-presenter at NCQA. “It’s not about seeing more patients, which is not feasible with scarce resources, but about focusing on the right patients while also reducing the administrative burden on clinicians. Our collaboration with Houston Methodist highlights the transformative potential of data-driven approaches in enhancing patient care and optimizing health outcomes.”

The program is expanding to all Houston Methodist hospitals, with a focus on continuously improving the processes and technology necessary to sustain consistently high rates of post-discharge follow-up for the patients with the greatest need.

About Health Data Analytics Institute (HDAI)
HDAI, a HealthTech company, has created the first Intelligent Health Management System, HealthVision™. Powered by predictive analytics and generative AI, HealthVision allows clinicians to work smarter, not harder, helping to fight clinician burnout, improve care coordination, and lower overall costs. For more information, please visit: www.hda-institute.com and on LinkedIn at linkedin.com/company/hdai.

Company contact: Carola Endicott, carola.endicott@hda-institute.com, 617-699-0725

View original content to download multimedia:https://www.prnewswire.com/news-releases/follow-up-visits-can-significantly-reduce-post-discharge-mortality-and-readmissions-for-highest-risk-patients-302292909.html

SOURCE Health Data Analytics Institute
