Technology
Huawei's new open source technique shrinks LLMs to make them run on less powerful, less expensive hardware
Positive | Technology
Huawei's Computing Systems Lab in Zurich has unveiled an innovative open-source technique called SINQ, which significantly reduces the memory requirements for large language models (LLMs) while maintaining their output quality. This advancement is crucial as it allows these powerful models to run on less expensive and less powerful hardware, making advanced AI technology more accessible to a wider range of users and applications. By providing the code for this technique, Huawei is not only contributing to the AI community but also paving the way for more efficient and cost-effective AI solutions.
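The summary does not describe SINQ's internals, so the following is only a generic illustration of why quantization shrinks memory use, not Huawei's actual algorithm: a plain round-to-nearest sketch that maps 32-bit float weights to 4-bit integers with a per-row scale.

```python
# Illustrative sketch only: generic symmetric round-to-nearest int4
# quantization with one scale per weight row. SINQ's real method
# differs; this just shows where the memory savings come from.

def quantize_row(weights, bits=4):
    """Map float weights to ints in [-(2**(bits-1)-1), 2**(bits-1)-1]."""
    qmax = 2 ** (bits - 1) - 1              # 7 for 4-bit
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize_row(q, scale):
    """Recover approximate float weights from the quantized ints."""
    return [v * scale for v in q]

row = [0.12, -0.53, 0.07, 0.91, -0.25]
q, s = quantize_row(row)
restored = dequantize_row(q, s)
# Each weight now occupies 4 bits instead of 32 (~8x less memory),
# at the cost of a rounding error of at most scale / 2 per weight.
```

Techniques like SINQ aim to keep that rounding error from degrading output quality, which is the hard part this sketch deliberately ignores.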
OpenAI's DevDay 2025 preview: Will Sam Altman launch the ChatGPT browser?
Positive | Technology
OpenAI is gearing up for DevDay 2025, its largest annual conference, where over 1,500 developers will gather in San Francisco. The event is a chance for OpenAI to showcase its innovations and defend its leadership in an increasingly competitive AI landscape. All eyes are on CEO Sam Altman to see whether he will unveil a ChatGPT browser, which could significantly enhance user experience and engagement.
New AI training method creates powerful software agents with just 78 examples
Positive | Technology
A study from Shanghai Jiao Tong University and the SII Generative AI Research Lab shows that large language models can be trained for complex agentic tasks with just 78 examples, challenging the assumption that vast datasets are necessary. The framework, known as LIMI, streamlines the training process while enhancing machine autonomy. The result could yield AI systems that need far less data, accelerating the deployment of intelligent software agents across industries.
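The summary does not give LIMI's data format, but the "curation over volume" idea can be sketched hypothetically: a handful of high-quality task demonstrations, serialized as JSONL for a standard supervised fine-tuning pipeline. All field names below are illustrative, not the paper's actual schema.

```python
# Hypothetical sketch: a small, carefully curated demonstration set
# standing in for a huge corpus. Field names and structure are
# illustrative only, not LIMI's real schema.
import json

curated_demos = [
    {"task": "Fix a failing unit test",
     "trajectory": ["run suite", "read traceback",
                    "patch off-by-one bug", "re-run suite"]},
    {"task": "Refactor duplicated code",
     "trajectory": ["locate repeated block", "extract helper function",
                    "run tests"]},
    # ...in LIMI's case, only 78 such examples in total.
]

# Serialize as JSONL, a common input format for fine-tuning jobs.
jsonl = "\n".join(json.dumps(d) for d in curated_demos)
print(f"{len(curated_demos)} demos, {len(jsonl)} bytes of training data")
```

The point of the paper, as summarized above, is that careful selection of such trajectories matters more than sheer dataset size.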
Google's Jules coding agent moves beyond chat with new command line and API
Positive | Technology
Google's coding assistant, Jules, is evolving beyond its chat interface to become a more integral part of developers' workflows. By introducing new command line and API functionalities, Google aims to enhance Jules' usability, making it a go-to tool for coding tasks. This shift is significant as it reflects the growing trend of integrating AI tools into everyday development processes, potentially improving efficiency and collaboration among developers.
Salesforce launches AI 'trust layer' to tackle enterprise deployment failures plaguing 80% of projects
Positive | Technology
Salesforce has launched a new AI 'trust layer' aimed at improving enterprise AI deployment, addressing a significant issue where over 80% of projects fail to provide real business value. This initiative is crucial as it not only enhances data management and governance but also seeks to build a reliable foundation for companies looking to leverage AI effectively. By tackling these challenges, Salesforce is positioning itself as a leader in the AI space, potentially transforming how businesses adopt and benefit from artificial intelligence.
HubSpot’s Dharmesh Shah on AI mastery: Why prompts, context, and experimentation matter most
Positive | Technology
At this year's INBOUND conference in San Francisco, HubSpot's Dharmesh Shah emphasized the importance of mastering AI through effective prompts, context, and experimentation. This event brought together marketing and sales professionals to explore innovative ideas and tools, showcasing both new features like the Creators Corner and beloved staples like HubSpot Academy Labs. The insights shared are crucial for professionals looking to leverage AI in their strategies, making this conference a significant opportunity for growth and learning.
'Western Qwen': IBM wows with Granite 4 LLM launch and hybrid Mamba/Transformer architecture
Positive | Technology
IBM has unveiled Granite 4.0, a large language model designed to deliver high performance while remaining cost-effective and memory-efficient. With its hybrid Mamba/Transformer architecture, Granite 4.0 is set to enhance a range of applications, making advanced AI more accessible to businesses and developers alike.
Microsoft retires AutoGen and debuts Agent Framework to unify and govern enterprise AI agents
Positive | Technology
Microsoft has announced the retirement of its AutoGen framework, which has been a key player in enterprise AI projects, particularly since the launch of AutoGen v0.4 earlier this year. The company is now introducing a new Agent Framework designed to unify its various AI offerings and enhance observability. This move is significant as it aims to streamline enterprise AI development, making it easier for businesses to manage and govern their AI agents effectively.
Databricks set to accelerate agentic AI by up to 100x with ‘Mooncake’ technology — no ETL pipelines for analytics and AI
Positive | Technology
Databricks is making waves in the tech world with its new 'Mooncake' technology, which promises to enhance agentic AI capabilities by up to 100 times. This innovation eliminates the need for traditional ETL pipelines, streamlining the process for enterprises that rely on PostgreSQL databases. By simplifying data analysis and AI model integration, Databricks is not only reducing costs but also empowering businesses to leverage their operational data more effectively. This development is significant as it could reshape how companies approach data analytics and AI, making advanced technologies more accessible.