Predicting California Bearing Ratio with Ensemble and Neural Network Models: A Case Study from Türkiye

arXiv — cs.LG · Wednesday, December 10, 2025 at 5:00:00 AM
  • A study introduces a machine learning framework for predicting the California Bearing Ratio (CBR) from a dataset of 382 soil samples collected across various geoclimatic regions of Türkiye. The approach aims to improve the accuracy and efficiency of CBR determination, which is crucial for assessing the load-bearing capacity of subgrade soils in infrastructure projects.
  • The model signals a shift toward more efficient and cost-effective methods in geotechnical engineering. By reducing reliance on traditional laboratory tests, it could enable quicker assessments and better-informed decisions in transportation infrastructure and foundation design; a minimal sketch of such a pipeline appears below.
— via World Pulse Now AI Editorial System
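
The digest does not give the study's exact model family or input features, but a minimal sketch of the kind of ensemble CBR regressor it describes might look as follows. The file name, feature columns, and hyperparameters are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of an ensemble CBR regressor; the CSV layout and
# feature names are illustrative stand-ins, not the study's predictors.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

df = pd.read_csv("cbr_samples.csv")  # hypothetical dataset, one row per sample
features = ["fines_content", "liquid_limit", "plasticity_index",
            "max_dry_density", "optimum_moisture"]  # assumed columns
X, y = df[features].values, df["cbr"].values

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05,
                                  max_depth=3, random_state=42)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"R2 = {r2_score(y_test, pred):.3f}, "
      f"MAE = {mean_absolute_error(y_test, pred):.2f}")
```

A neural-network variant of the same pipeline would swap the gradient-boosted trees for a small feed-forward regressor over the same tabular features; the train/test protocol is unchanged.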


Continue Reading
Harnessing AI to solve major roadblock in solid-state battery technology
Positive · Artificial Intelligence
Researchers at Edith Cowan University are leveraging artificial intelligence (AI) and machine learning to enhance the reliability of solid-state batteries, addressing a significant challenge in battery technology. This initiative aims to improve performance and safety in energy storage solutions.
Unsupervised Learning of Density Estimates with Topological Optimization
Neutral · Artificial Intelligence
A new paper has been published on arXiv detailing an unsupervised learning approach for density estimation using a topology-based loss function. This method aims to automate the selection of the optimal kernel bandwidth, a critical hyperparameter that influences the bias-variance trade-off in density estimation, particularly in high-dimensional data where visualization is challenging.
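The topology-based loss itself is not described in this digest; as a baseline for the bandwidth-selection problem it automates, here is a standard cross-validated search over kernel bandwidths with scikit-learn. The data and search grid are toy assumptions.

```python
# Baseline bandwidth selection by cross-validation; the paper replaces
# this kind of search with a topology-based loss, which is not shown here.
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))  # toy 2-D sample

grid = GridSearchCV(
    KernelDensity(kernel="gaussian"),
    {"bandwidth": np.logspace(-1, 0.5, 20)},
    cv=5)  # selects the bandwidth maximizing held-out log-likelihood
grid.fit(X)
print("selected bandwidth:", grid.best_params_["bandwidth"])
```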
High-Throughput Unsupervised Profiling of the Morphology of 316L Powder Particles for Use in Additive Manufacturing
Positive · Artificial Intelligence
A new automated machine learning framework has been developed to profile the morphology of 316L powder particles for Selective Laser Melting (SLM) in additive manufacturing. This approach utilizes high-throughput imaging, shape extraction, and clustering to analyze approximately 126,000 powder images, significantly enhancing the characterization process compared to traditional methods.
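A minimal sketch of the shape-extraction-and-clustering stage, assuming Otsu thresholding, a handful of scikit-image region properties, and k-means. The descriptor set, image directory, and cluster count are illustrative choices, not the framework's actual configuration.

```python
# Sketch of shape extraction + clustering for particle micrographs;
# descriptors, directory layout, and cluster count are assumptions.
import glob
import numpy as np
from skimage import io, measure, filters
from sklearn.cluster import KMeans

def shape_descriptors(path):
    """Segment one grayscale particle image and return shape features."""
    img = io.imread(path, as_gray=True)
    mask = img > filters.threshold_otsu(img)
    region = max(measure.regionprops(measure.label(mask)),
                 key=lambda r: r.area)  # keep the largest blob
    circularity = 4 * np.pi * region.area / region.perimeter ** 2
    return [region.area, region.eccentricity, region.solidity, circularity]

paths = sorted(glob.glob("powder_images/*.png"))  # assumed file layout
X = np.array([shape_descriptors(p) for p in paths])
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
```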
Reading the immune clock: a machine learning model predicts mouse immune age from cellular patterns
Neutral · Artificial Intelligence
A recent study published in Nature — Machine Learning presents a machine learning model capable of predicting the immune age of mice based on cellular patterns. This innovative approach leverages complex data analysis to enhance understanding of immune system aging, potentially leading to advancements in immunology and age-related research.
Deep Learning and Machine Learning, Advancing Big Data Analytics and Management: Unveiling AI's Potential Through Tools, Techniques, and Applications
Positive · Artificial Intelligence
Recent advancements in artificial intelligence (AI), particularly in machine learning and deep learning, are significantly enhancing big data analytics and management. This development focuses on large language models (LLMs) like ChatGPT, Claude, and Gemini, which are transforming industries through improved natural language processing and autonomous decision-making capabilities.
Interpretive Efficiency: Information-Geometric Foundations of Data Usefulness
Neutral · Artificial Intelligence
A new concept called Interpretive Efficiency has been introduced, which quantifies how effectively data supports interpretive representations in machine learning. This measure is grounded in five axioms and relates to mutual information, providing a framework for assessing the usefulness of data in interpretive tasks.
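The five axioms are not reproduced in this digest; as a familiar reference point for the mutual-information connection the summary mentions, here is a toy estimate of how much two candidate features tell you about a target. All data is synthetic and the example is not the paper's measure.

```python
# Mutual information between features and a target as a rough proxy for
# "data usefulness"; the paper's axiomatic measure is not shown here.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
x_useful = rng.normal(size=1000)           # carries signal about y
x_noise = rng.normal(size=1000)            # independent of y
y = np.sin(x_useful) + 0.1 * rng.normal(size=1000)

mi = mutual_info_regression(np.column_stack([x_useful, x_noise]), y)
print(f"MI(useful, y) = {mi[0]:.3f}, MI(noise, y) = {mi[1]:.3f}")
```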
Precise Liver Tumor Segmentation in CT Using a Hybrid Deep Learning-Radiomics Framework
Neutral · Artificial Intelligence
A novel hybrid framework has been introduced for precise liver tumor segmentation in CT scans, combining an attention-enhanced U-Net with handcrafted radiomics and voxel-wise 3D CNN refinement. This approach aims to improve the accuracy and efficiency of tumor delineation, addressing challenges such as low contrast and blurred boundaries in imaging.
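The full hybrid pipeline is beyond a digest, but the attention-enhanced U-Net component can be illustrated by a minimal additive attention gate in PyTorch. The channel sizes, and the simplification that gating and skip features share a spatial resolution, are assumptions rather than the paper's configuration.

```python
# Minimal additive attention gate of the kind used in attention U-Nets;
# channel sizes and equal spatial resolutions are illustrative assumptions.
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    def __init__(self, gate_ch, skip_ch, inter_ch):
        super().__init__()
        self.w_g = nn.Conv2d(gate_ch, inter_ch, kernel_size=1)
        self.w_x = nn.Conv2d(skip_ch, inter_ch, kernel_size=1)
        self.psi = nn.Sequential(nn.Conv2d(inter_ch, 1, kernel_size=1),
                                 nn.Sigmoid())

    def forward(self, gate, skip):
        # Additive attention: weight each skip-connection location by
        # its relevance to the decoder's gating signal.
        attn = self.psi(torch.relu(self.w_g(gate) + self.w_x(skip)))
        return skip * attn

g = torch.randn(1, 128, 32, 32)          # decoder gating signal
x = torch.randn(1, 64, 32, 32)           # encoder skip features
out = AttentionGate(128, 64, 32)(g, x)   # same shape as x, re-weighted
```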
GPU-GLMB: Assessing the Scalability of GPU-Accelerated Multi-Hypothesis Tracking
Neutral · Artificial Intelligence
Recent research has focused on the scalability of GPU-accelerated multi-hypothesis tracking, particularly through the Generalized Labeled Multi-Bernoulli (GLMB) filter, which allows for multiple detections per object. This method addresses the computational challenges associated with maintaining multiple hypotheses in multi-target tracking systems, especially in distributed networks of machine learning-based virtual sensors.
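The GLMB filter itself is involved, but the scalability question centers on hypothesis bookkeeping. A minimal sketch of the GPU-friendly pruning step, keeping the K highest-weight hypotheses with PyTorch; the sizes and weights are arbitrary placeholders, not the paper's implementation.

```python
# Sketch of the pruning step that dominates GLMB bookkeeping: keep only
# the K highest-weight hypotheses, batched on the GPU when available.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
log_w = torch.randn(100_000, device=device)  # hypothetical hypothesis log-weights
K = 1_000
top_w, top_idx = torch.topk(log_w, K)          # select the K largest; rest dropped
top_w = top_w - torch.logsumexp(top_w, dim=0)  # renormalize in log space
```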