“The trajectory of Codex is thinking beyond pure engineers; it’s becoming a tool for everyone.”
Our friends at Virgin Atlantic, Richard Masters and Neil Letchford, launched their new mobile app in beta and saw exceptional quality and test coverage thanks to Codex.
Follow this org to sign up: https://hf-learn.short.gy/nvB8JD
The hardest agent contest in AI just launched. Here’s how to win it.
You can now sign up to Humanity’s Last Hackathon. You build Mac Metal kernels. You use Codex from OpenAI to optimize them. You submit through Hugging Face. The fastest kernels qualify for the final battle.
What this video covers:
– The qualification task, explained
– Setting up Codex for kernel work
– Benchmarking and submitting through Hugging Face
– What it takes to climb the leaderboard and advance
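The qualification loop the video walks through amounts to a benchmark harness: run each kernel repeatedly and rank by a robust timing statistic. A minimal Python sketch of that idea (the `dot` stand-in kernel, warmup count, and run count are illustrative assumptions, not the contest's actual harness):

```python
import statistics
import time

def benchmark(kernel, *args, warmup=3, runs=20):
    """Time a callable: warm up, then return the median of repeated runs.

    The median resists outliers (GC pauses, thermal throttling) better
    than the mean, which matters when ranking kernels on a leaderboard.
    """
    for _ in range(warmup):  # warm caches before the timed runs
        kernel(*args)
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        kernel(*args)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Illustrative "kernel": a plain Python dot product standing in
# for a compiled Metal kernel under test.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

elapsed = benchmark(dot, list(range(1000)), list(range(1000)))
```

A real submission would time the Metal kernel itself (including GPU synchronization), but the shape of the loop is the same.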
Video by CNCF [Cloud Native Computing Foundation] via YouTube
Falco, the Cloud Native Runtime Security project, is constantly evolving to meet the demands of modern cloud environments. This livestream dives into the latest advancements and strategic direction of the project, with a focus on two major areas: the new Falco Operator and features that enhance performance and reliability.
The Falco Operator simplifies deployment, configuration, and management across Kubernetes clusters, making it easier to secure runtime environments at scale. The session also covers performance optimizations for high-throughput environments, community contributions, ecosystem integrations, and the upcoming release roadmap.
Plus, don’t miss out on an exclusive sneak peek at a brand new project — Prempti!
Hitesh Kamdar (Head of Capital Markets Architecture at RBC Capital Markets) discusses why open source is now a strategic differentiator for global banks. He announces the contribution of FiveSpot, RBC’s homegrown HPC orchestrator, and explores the role of automated architecture (CALM) and AI governance in modern financial engineering.
🇬🇧 Join us in London! Catch the latest on Financial AI and HPC at OSFF London on June 25, 2026: https://hubs.ly/Q041YV9Z0 (Use Code: 26YTOSFFLN20C)
🕒 Timestamps:
0:00 Welcome to OSFF Toronto
0:35 Open Source as a Strategic Differentiator
1:35 RBC’s Expanding Open Source Footprint
1:55 CALM: Automating the Architecture Lifecycle
2:45 CDM: Standardizing Financial Data Transmission
3:30 Tools for UI and Contribution: FDC3 & GitProxy
3:55 AI Governance: Ensuring Safety and Soundness
4:15 Announcing FiveSpot: RBC’s Homegrown HPC Project
5:45 The Mission: From "Future" to "Present"
6:45 The 2030 Vision: GenAI and Regulation
7:45 Call to Action: Adopt and Contribute
📊 The Problem: The "Side Project" Perception
Historically, open source in banking was often viewed as a peripheral cost-saving measure. This led to a "governance gap" where complex requirements—particularly in high-performance compute and risk calculations—were handled by fragmented, proprietary stacks that lacked the scalability required for modern capital markets.
🏗️ The Solution: Architecture-as-Code & Project FiveSpot
Hitesh outlines RBC’s evolution from a consumer of open source to a primary contributor of foundational infrastructure:
* Project FiveSpot: RBC’s first homegrown contribution to FINOS—an orchestrator for High-Performance Compute (HPC) that manages workloads across cloud and on-prem with deterministic performance.
* Architecture Transformation (CALM): Leveraging the CALM project to ensure that technical standards and architecture are as automated as code generation within the SDLC.
* Standardizing Data (CDM): Utilizing the Common Domain Model to ensure trades are transmitted internally and to regulators in a consistent, unified way.
The takeaway: Finance is no longer "going" open; it is open. Hitesh Kamdar makes the case that contributing projects like FiveSpot back to the community is the key to achieving real business outcomes and financial resilience.
🌐 More about FINOS: https://www.finos.org/
📧 Join our newsletter: https://www.finos.org/sign-up
🎙️ Listen to our Open Source in Finance Podcast: https://www.youtube.com/@FINOS/podcasts
LinkedIn: https://www.linkedin.com/company/finosfoundation
Video by Open Data Science and AI Conference via YouTube
The conversation explores why today’s models can solve complex problems yet fail on simple ones, revealing the gap between generation and true understanding.
Visit our website and choose the nearest ODSC event to attend and experience all our training and workshops: https://odsc.ai
To watch more videos like this, visit https://aiplus.training
Sign up for the newsletter to stay up to date with the latest trends in data science: https://opendatascience.com/newsletter/
Follow us online!
• Facebook: https://www.facebook.com/OPENDATASCI
• Instagram: https://www.instagram.com/odsc/
• Blog: https://opendatascience.com/
• LinkedIn: https://www.linkedin.com/company/open-data-science/
• X (Twitter): https://x.com/_odsc
Support us on Patreon and get an ad-free RSS feed with some early episodes. https://www.patreon.com/LateNightLinux
Hitting the limit for hard links, a parent struggles to get back into their teen’s compromised Discord account, the demise of tower PCs and of general-purpose computing more broadly, and changing the properties of existing ZFS pools.
New month, new capabilities. Get a quick tour of the April 2026 feature highlights across SAP Datasphere and SAP Business Data Cloud, with practical takeaways you can use right away.
In this update, Klaus-Peter Sauer covers what shipped in April, focusing on improvements that help teams keep data products current, monitor pipelines with more confidence, tune Spark compute, strengthen secure connectivity, and simplify day-to-day modeling and monitoring.
You will see what is new in:
✅ Update action for installed data products (SAP Business Data Cloud): Update installed data products in your SAP Datasphere space when a new minor version is available.
✅ BDC cockpit data product monitoring: Better visibility into SAP-managed data product pipelines, including where data stands and where it breaks.
✅ Local table partitioning: Partition existing SAP-managed local tables in SAP HANA Cloud spaces, even when they already contain data.
✅ Spark configuration (file spaces): Adjust compute per task activity and create custom configurations per file space.
✅ Cloud Connector support for Google BigQuery: Configure private connectivity so traffic runs through secure tunnels, supporting targets without public endpoints.
✅ Task chain deletion types: New support for delete all records and delete filtered records for local tables and local tables on files.
✅ Semantic onboarding of SAP HANA Cloud calculation views: Import calc views as remote tables while preserving semantic information.
✅ Analytic model clarity: Toggle inherited elements in the properties panel to separate inherited vs. local elements.
✅ Improved monitoring tool navigation: A cleaner monitoring menu that groups key activities in one place.
Chapters:
00:29 – Update Action for Installed Data Products
00:59 – BDC Cockpit Data Product Monitoring
01:46 – Local Table Partitioning
02:11 – Spark Configuration
03:02 – Cloud Connector Support for Google BigQuery
03:32 – Task Chain Deletion Types
04:02 – Semantic Onboarding of HANA Cloud Calc Views
04:56 – Analytic Model: Inherited vs. Local Properties
05:25 – Improved Monitoring Tool Navigation
06:02 – Summary and Outro
Learn more about SAP Datasphere: https://www.sap.com/products/technology-platform/datasphere.html
Join our SAP Datasphere community to stay up to date: https://community.sap.com/topics/datasphere
Check out our product roadmap for SAP Datasphere: https://roadmaps.sap.com/board?PRODUCT=73555000100800002141
Follow us on social:
LinkedIn: https://www.linkedin.com/company/sap/
Instagram: https://www.instagram.com/sap
Facebook: https://www.facebook.com/SAP/
Threads: https://www.threads.com/@sap
About SAP:
As a global leader in enterprise applications and business AI, SAP stands at the nexus of business and technology. For over 50 years, organizations have trusted SAP to bring out their best by uniting business-critical operations spanning finance, procurement, HR, supply chain, and customer experience. For more information, visit: https://www.sap.com/index.html
An AI agent wiped out an entire production database in 9 seconds using Cursor and Claude 4.6. Backups were stored with the database, making recovery impossible. The agent confessed to violating safety rules. #AIDataLoss #CyberSecurity #ProductionDatabase #AICopilot