On my recent trip to San Francisco, I had the chance to ride in a Waymo self-driving car, and it felt like stepping into the future. No driver. No human intervention. Just AI quietly taking charge of something we’ve always associated with human reflexes and instincts.

At SilverFern Digital we keep a close eye on breakthrough experiences like this. Studying how products like these work helps us absorb key learnings and put them to use in our AI-first products and everyday design practice. So we broke it down: how is AI able to do this so seamlessly?

🔹 Studying Patterns: Waymo cars don’t just "react." They’ve been trained on millions of miles of driving data, learning the tiniest nuances of human and environmental behavior on the road.

🔹 Building Intelligent Systems: From perception (seeing pedestrians, cyclists, traffic signals) to prediction (anticipating how others might move), every decision is powered by a layered AI stack working in real time.

🔹 Cohesive UX & Trust: The magic isn’t just in the AI; it’s in how that intelligence is communicated back to passengers. Clear displays, intuitive cues, and subtle motions help you trust the car. That’s where UX becomes just as important as AI.

This intersection of AI, UX, and automotive design is reshaping not just how cars move, but how we move, work, and live. For me, the ride wasn’t about the tech; it was about how natural it felt to let go, to trust, and to experience safety redefined by design.

The future of transportation isn’t just autonomous. It’s empathetic, data-driven, and deeply human-centered. As we build more AI-first products, these innovations inspire us to design new-age automotive experiences that push the boundaries of design, technology, and trust.

What are your automotive transformation experiences? #AI #UXDesign #FutureOfMobility #SilverfernDesign
AI In Autonomous Vehicle Technology
-
Autonomous Vehicles. The AV revolution is underway. Driven by breakthroughs in AI, compute, and simulation, plus dramatic cost reductions in sensors and hardware, robotaxis are being tested in several U.S. cities, and globally more than 30 companies are piloting or scaling fleets.

In the U.S. there are 10 million workers who drive for a living: a) 3.5M truck drivers, b) 2M ride-hailing drivers (Uber, Lyft), c) 1M delivery van drivers (UPS, FedEx, couriers), d) 500k bus drivers (school & transit), e) 400k taxi drivers, and f) 3M gig-economy drivers (food delivery), representing 6.25% of the total workforce. Globally, roughly 400M workers drive for a living. Replacing the truck or Uber driver with an AV is estimated to cut costs per mile by more than half. The implications are massive.

In the U.S., auto accidents result in 44,000 fatalities and 2.3 million injuries annually, with an economic cost of $350 billion (medical, productivity loss, property damage, legal expense). AVs are expected to reduce accidents by 90%+.

"AI on wheels," as one analyst labels it, is powered by neural networks trained on billions of road miles (Waymo alone has logged 100 million with no human driver behind the wheel). Tesla recently launched its pilot program at a price point well below Uber ($4.20 per ride), while Uber itself plans to deploy 20,000 driverless AVs. Bank of America estimates $1.2 trillion of AV spend across robotaxis, logistics, delivery, agriculture, and public transit.

This shift could redefine urban design, free up parking, reduce congestion, and accelerate the move away from traditional auto ownership as more people use AVs on demand instead of owned vehicles. China may lead the race given its demographic urgency and regulatory structure, but the U.S. isn’t far behind. The winners will be the OEMs who master software, data, hardware integration, and cost-efficient assembly.
Key technologies and components include radar, LiDAR, cameras, chips, and cockpit-to-console systems, with nearly 100 companies providing parts, technology, and components in a supplier base that has largely evolved beyond traditional auto parts suppliers.

My most immediate questions/issues related to the advancement of AVs:
- Employment, and potential displacement of active drivers
- Demand and profitability for the auto OEMs (GM, Ford, Stellantis vs. Tesla): new car sales, adoption, fleet size, efficiency
- Auto parts supplier relevance in an AV transport world
- Rental car companies (Avis, Hertz, Budget) vs. the robotaxi model
- Auto insurance, the premium vs. payout model with fewer accidents, and Tesla providing vehicle insurance from its own insurance arm

The auto sector has underperformed in 2025; credit spreads have widened. Stay tuned, it’s early days.
-
Autonomous vehicles leveraging advanced AI like Vision Transformers highlight the potential for safer, smarter transportation systems, where real-time decisions driven by enhanced image analysis could redefine how we navigate urban environments and beyond. Vision Transformers (ViTs) utilize attention mechanisms to process diverse visual inputs simultaneously, enhancing the accuracy of object recognition and decision-making in autonomous vehicles. ViTs require substantial investment in R&D, collaborative partnerships, and regulatory alignment to ensure safe and reliable integration. Training technical staff and gaining public trust remain essential steps for widespread adoption, while companies must also address the cost implications to position themselves competitively in a rapidly evolving market. #AI #AutonomousDriving #VisionTransformers #FutureMobility #Transportation
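To make the attention idea concrete, here is a minimal sketch of the core ViT operation: an image is cut into patches and each patch attends to every other patch via scaled dot-product attention. This is plain NumPy with a single head and random weights (no trained model, no positional embeddings), purely to illustrate how "diverse visual inputs are processed simultaneously"; the function and variable names are ours, not any library's API.

```python
import numpy as np

def patchify(image, patch=8):
    """Split an (H, W) image into flattened, non-overlapping patches."""
    h, w = image.shape
    patches = image.reshape(h // patch, patch, w // patch, patch)
    return patches.transpose(0, 2, 1, 3).reshape(-1, patch * patch)

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product attention over patch embeddings."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v   # every patch mixes information from all patches

rng = np.random.default_rng(0)
image = rng.standard_normal((32, 32))        # stand-in for a camera frame
tokens = patchify(image)                     # 16 patches of 64 pixels each
d = tokens.shape[1]
out = self_attention(tokens,
                     rng.standard_normal((d, d)),
                     rng.standard_normal((d, d)),
                     rng.standard_normal((d, d)))
print(out.shape)   # (16, 64): one updated embedding per patch
```

Because every patch attends to every other patch in one step, a distant pedestrian and a nearby traffic signal can influence each other's representations immediately, which is the property the post credits for improved object recognition.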
-
Transportation is the largest employment sector on Earth. Over 1 billion people globally work in roles directly tied to moving people or goods: drivers, operators, couriers, logistics staff. That industry is now facing a seismic shift.

At Viva Technology #Paris, I got a hands-on look at Tesla’s new Robotaxi: a fully autonomous vehicle with no steering wheel, no pedals, and no driver’s seat. Just sensors, AI, and minimalism.

Here’s what we know:
• Tesla plans to unveil the production version on August 8, 2025
• Initial manufacturing is already underway in Texas
• Pricing aims to undercut public transport, not just Uber
• It will operate via Tesla’s own ride-hailing app
• First cities targeted: Austin, San Francisco, Los Angeles
• No human driver: full autonomy powered by Tesla's FSD and Dojo AI stack
• Global expansion depends on regulatory approval and real-world test data

Tesla isn’t alone.
• Waymo (Alphabet) is running autonomous taxis in Phoenix and San Francisco
• Cruise (GM) is paused after safety issues but plans to return
• Baidu, Inc. and AutoX are already live in parts of China
• Uber partnered with Waymo, but its core model faces existential risk

The implications are massive:
• Driving is the most common job in 29 US states
• Millions of Uber, truck, and taxi drivers globally could be replaced
• Cities may need to rethink urban infrastructure, licensing, and labor support
• Investors will shift focus to platform owners, not fleet operators

We’re not talking about a decade from now. We’re talking about product launches this year, pilots already active, and regulators being pushed to move fast. The transportation sector as we know it is approaching a turning point. Are we ready?

#AutonomousVehicles #TeslaRobotaxi #FutureOfWork #TransportationDisruption #MobilityTech #AIandJobs #Tesla #Waymo #Cruise #UberFuture #DigitalTransformation #AIInnovation
-
The Future of Autonomous Vehicles: How GenAI is Accelerating Innovation

The future of fully autonomous vehicles (AVs) is accelerating, thanks to the transformative power of generative AI (GenAI). As highlighted in recent insights from CB Insights, #GenAI is breaking down key barriers that have long delayed the widespread adoption of self-driving #cars.

(1) Enhancing In-Car Communication
One major advancement is the enhancement of in-car voice assistants. GenAI-powered LLMs are bridging the communication gap between passengers and self-driving cars, evolving from pre-recorded commands to hyper-personalized, natural conversations. Imagine saying, “Let’s go pick up food at my favorite restaurant,” and your car seamlessly understanding and acting on it, a future that’s already within reach.

(2) Reducing Training Costs
Training costs are also being slashed through GenAI-simulated environments. These virtual settings allow AV systems to rack up millions of miles driven in a controlled, cost-effective manner, improving safety testing without the need for extensive real-world trials. This innovation is a game-changer for automakers aiming to refine their technology efficiently.

(3) Improving Safety and Transparency
Safety and transparency are critical for gaining regulatory trust, and GenAI is stepping up here too. By providing clear explanations for driving decisions, moving away from the “black box” approach, LLMs enhance accountability. For instance, a car detecting a pedestrian and explaining its stop decision in plain language builds confidence among regulators and passengers alike.

(4) Strategic Partnerships
To stay competitive, automakers must partner with automotive AI chip manufacturers capable of supporting local LLM processing. Factors like inference time, energy efficiency, and durability will be key in selecting the right technology partners.
Meanwhile, car insurance providers are adapting by developing new risk assessment models, including provisions for cybersecurity threats, potentially in collaboration with automotive cybersecurity firms.

(5) Transforming Cars into Digital Platforms
Looking ahead, GenAI is turning cars into digital platforms with agentic AI features. This opens doors for automakers and AV providers to team up with AI agent developers, creating smarter, more interactive vehicles. The UK AI #startup PhysicsX, nearing a $1 billion valuation, exemplifies this trend, developing advanced AI tools for the automotive and #aerospace sectors that could further propel AV #innovation. EmpowerEdge Ventures
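The transparency idea in point (3), turning a driving decision into a plain-language rationale, can be sketched with a toy example. A real system would feed structured perception output into an on-board LLM; the template function below is a hypothetical, simplified stand-in for that layer, with field names invented for illustration.

```python
# Toy stand-in for an LLM-based explanation layer: structured perception
# output in, plain-language rationale out. The dict schema is hypothetical.

def explain_decision(detection: dict) -> str:
    actor = detection["type"]          # e.g. "pedestrian", "cyclist"
    distance = detection["distance_m"] # estimated range to the actor
    action = detection["action"]       # maneuver the planner chose
    return (f"Detected a {actor} {distance} m ahead; "
            f"{action} to maintain a safe stopping distance.")

print(explain_decision(
    {"type": "pedestrian", "distance_m": 12, "action": "braking"}))
# prints: Detected a pedestrian 12 m ahead; braking to maintain a safe stopping distance.
```

The point of the design, whether template or LLM, is that every maneuver leaves behind a human-readable record a regulator or passenger can audit, rather than an opaque tensor.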
-
Researchers at Hong Kong University's MaRS Lab have just published another jaw-dropping paper featuring their safety-assured high-speed aerial robot path planning system, dubbed "SUPER". With a single MID360 lidar sensor they repeatedly achieved autonomous one-shot navigation at speeds exceeding 20 m/s in obstacle-rich environments. Since it only requires a single lidar, these vehicles can be built with a small footprint and can navigate completely independent of light, GPS, and radio link.

This is not just #SLAM on a #drone. The SUPER system continuously computes two trajectories in each re-planning cycle: a high-speed exploratory trajectory and a conservative backup trajectory. The exploratory trajectory is designed to maximize speed by considering both known free space and unknown areas, allowing the drone to fly aggressively and efficiently toward its goal. In contrast, the backup trajectory is entirely confined within the known free space identified by the point-cloud map, ensuring that if unforeseen obstacles are encountered or the system’s perception becomes uncertain, the system can safely switch to a precomputed, collision-free path.

The direct use of LiDAR point clouds for mapping eliminates the need for time-consuming occupancy grid updates and complex data fusion algorithms. Combined with an efficient dual-trajectory planning framework, this leads to significant reductions in computation time, often an order of magnitude faster than comparable SLAM-based systems, allowing the MAV to operate at higher speeds without sacrificing safety.

This two-pronged planning strategy is particularly innovative because it directly addresses the classic speed-safety trade-off in autonomous navigation. By planning an exploratory trajectory that pushes the speed envelope and a backup trajectory that guarantees safety, SUPER can achieve high-speed flight (demonstrated speeds exceeding 20 meters per second) without compromising on collision avoidance.
If you've been tracking the progress of autonomy in aerial robotics and matching it to the winning strategies emerging in Ukraine, it's clear we're likely to experience another ChatGPT moment in this domain, very soon. #LiDAR scanners will continue to get smaller and cheaper, solid-state VCSEL-based sensors are rapidly improving, and it is conceivable that vehicles with this capability can be built and deployed with a bill of materials below $1,000. Link to the paper in the comments below.
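The dual-trajectory switch described above can be sketched in a few lines. This is our own simplified rendering of the idea, not the paper's implementation or API: keep an aggressive path and a backup path every re-planning cycle, and fall back whenever safety can no longer be guaranteed.

```python
# Sketch of SUPER-style dual-trajectory selection (names are ours, not the
# paper's): the backup path must lie entirely in mapped free space, so it
# is always safe to fall back to when perception degrades.

from dataclasses import dataclass

@dataclass
class Trajectory:
    waypoints: list       # planned (x, y) positions
    in_known_free: bool   # True if entirely inside mapped free space

def select_trajectory(exploratory: Trajectory, backup: Trajectory,
                      obstacle_on_path: bool, perception_ok: bool) -> Trajectory:
    """Fly the aggressive path unless safety can no longer be guaranteed."""
    if obstacle_on_path or not perception_ok:
        assert backup.in_known_free, "backup must stay in mapped free space"
        return backup
    return exploratory

fast = Trajectory(waypoints=[(0, 0), (5, 2), (12, 4)], in_known_free=False)
safe = Trajectory(waypoints=[(0, 0), (2, 1)], in_known_free=True)
chosen = select_trajectory(fast, safe, obstacle_on_path=True, perception_ok=True)
print(chosen is safe)  # prints: True
```

The invariant that makes this resolve the speed-safety trade-off is in the assert: the exploratory path may cross unknown space, but the fallback never does, so a switch is always collision-free with respect to the current map.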
-
I am delighted to share an example illustrating how to enable a nonholonomic Wheeled Mobile Robot (WMR) to avoid obstacles and overcome barrier limitations. This is achieved by integrating the Artificial Potential Field (APF) method, feedback linearization (FL) with discrete feedforward gains for trajectory tracking, and the Unscented Kalman Filter (UKF) to estimate longitudinal and lateral positions. Together, these techniques drive the mobile robot to track a predefined trajectory, ensuring it visits all waypoints while adhering to environmental constraints. Simulations validate the performance of the control strategy.

The 2D trajectory tracking problem for a WMR is to propose a virtual mobile robot model that perfectly tracks the ideal spatial trajectory, and then let the actual robot track it. The desired motion is specified in spatial coordinates through the robot's states, such as angular velocity, linear velocity, and position, and the setup involves constructing a two-dimensional map of the known environment with specific obstacles and barriers.

Since the model is highly nonlinear and involves nonholonomic constraints, a nonlinear control scheme is more suitable than a linear one. Therefore, we design an FL-based controller, followed by the formulation of QP problems in discrete time, to address the LTV problem. This approach minimizes trajectory oscillations by penalizing velocity errors in the quadratic cost function.

The UKF is well suited to systems with nonlinear dynamics and nonlinear measurement models, which are typical of WMRs. It employs the unscented transform to more accurately capture the mean and covariance of a probability distribution under nonlinear transformations. The UKF is also a tool used within the SLAM framework to solve the localization and mapping problem in scenarios with nonlinear dynamics and measurements; the two are deeply interrelated, with the UKF enhancing the accuracy and reliability of SLAM systems.
The APF models the environment using virtual forces: attractive forces pull the WMR toward the target, while repulsive forces push it away from obstacles. The combination of these forces generates a potential field that guides the robot along a safe and efficient path to its goal. However, the standard APF method has limitations, such as the local minimum problem, where the robot can become stuck in a region without reaching the target. Enhancements, such as introducing secondary virtual random perturbations, help overcome this issue.

In summary, local path planning is critical for addressing active safety challenges in autonomous robots, offering strong real-time performance and computational efficiency. While algorithms such as A*, model-based RL, genetic algorithms, ant colony optimization, and particle swarm optimization have their advantages, the APF method stands out for its simplicity, rapid computation, and effectiveness in obstacle avoidance, making it particularly suitable for dynamic and complex environments.
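The attractive/repulsive force combination above can be sketched directly. This is a textbook-style APF force law in NumPy, not the post's actual code; the gain names (k_att, k_rep) and the influence radius rho0 are standard but chosen here for illustration.

```python
import numpy as np

def apf_force(pos, goal, obstacles, k_att=1.0, k_rep=100.0, rho0=2.0):
    """Attractive pull toward the goal plus repulsive push from nearby obstacles."""
    force = k_att * (goal - pos)           # attraction grows with distance to goal
    for obs in obstacles:
        diff = pos - obs
        rho = np.linalg.norm(diff)
        if rho < rho0:                     # only obstacles inside rho0 repel
            force += k_rep * (1 / rho - 1 / rho0) / rho**3 * diff
    return force

pos = np.array([0.0, 0.0])
goal = np.array([10.0, 0.0])
obstacles = [np.array([5.0, 0.5])]
step = apf_force(pos, goal, obstacles)
print(step)  # attraction only here: the obstacle lies outside rho0
```

Following the negative gradient of the summed potentials yields the path; when the attractive and repulsive terms cancel (a local minimum), adding a small random perturbation to `force`, as the enhancement above suggests, lets the robot escape.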
-
How do you search a large area with drones without telling each one where to go? That's the Autonomous Aircraft Search & Service (A2S2) problem. Think search & rescue after a natural disaster, wildfire detection over forests, or infrastructure inspection across hundreds of square miles.

The brute-force approach (dividing the map into equal zones and sweeping them) falls apart fast. Targets aren't evenly distributed. Terrain changes. Some areas matter more than others. And you don't know what you don't know.

So I took a different approach: each drone maintains a shared belief map, a probability grid of where targets might still be hiding. As drones sweep areas and find nothing, belief decays. When a sensor picks up a signal, belief spikes. The decision engine uses Hamiltonian optimal transport (the same math behind logistics routing and fluid dynamics) to compute where each drone should fly next. It balances two things: go where the belief is highest, but don't send all 8 drones to the same spot.

The result: 8 drones, 50 targets, 10,000-cell grid. All 50 targets found in 295 steps. No central coordinator. No pre-planned paths. Just local observations and shared belief.

The simulation below shows it in action. Left panel: drones (colored markers) hunting targets (red stars) across a 100x100 grid. Right panel: belief map fading to dark as the area gets explored.

This is a building block for real-world autonomous search, where the map is bigger, the sensors are noisier, and the stakes are higher. Code will be open. More to come. #AutonomousSystems #Drones #UAV #Robotics #OptimalTransport #SearchAndRescue #MultiAgentSystems #AI
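The decay-and-spike update on the shared belief map can be sketched in a few lines. This is our own minimal rendering of the idea described above (the post's code is not yet released), with invented parameter names and a uniform prior for illustration.

```python
import numpy as np

def update_belief(belief, visited_cells, detections, decay=0.5, spike=0.9):
    """Shared belief grid: decay where a drone swept and found nothing,
    spike where a sensor registered a signal, then renormalize."""
    b = belief.copy()
    for cell in visited_cells:
        b[cell] *= decay      # "we looked here and saw nothing"
    for cell in detections:
        b[cell] = spike       # strong evidence a target is present
    return b / b.sum()        # keep it a probability distribution

belief = np.full((10, 10), 1 / 100)   # uniform prior over a small grid
belief = update_belief(belief,
                       visited_cells=[(0, 0), (0, 1)],
                       detections=[(5, 5)])
print(belief[5, 5] > belief[0, 0])    # prints: True
```

A planner then steers each drone toward high-belief cells (in the post, via Hamiltonian optimal transport, which also spreads the fleet out); because swept cells fade, drones naturally stop revisiting exhausted areas without any central coordinator.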
-
Regardless of what side of the AI debate you find yourself on, the truth is that AI is here, it's evolving, and it will become a key component of everything we do in the future. You can't stop evolution. AI presents numerous opportunities to revolutionize public transportation, paving the way for more efficient, sustainable, and user-friendly systems. Here are some of the key opportunities I'm keeping my eyes on:

- Optimized Routes and Schedules: AI can dynamically adjust routes and schedules based on real-time data, reducing travel times and improving punctuality.
- Traffic Flow Management: AI can optimize traffic signals and manage congestion, prioritizing public transportation vehicles and enhancing overall traffic flow.
- Predictive Maintenance: By predicting and addressing maintenance needs before failures occur, AI can reduce repair costs and extend the lifespan of vehicles and infrastructure.
- Energy Management: AI can optimize energy usage for electric buses and trains, leading to significant cost savings and reduced environmental impact.
- Real-time Surveillance: AI-powered video analysis can enhance security by detecting suspicious activities and potential threats in real time.
- Incident Prediction and Prevention: AI can predict potential accidents or safety issues, allowing proactive measures to be taken.
- Personalized Travel Information: AI can provide personalized travel recommendations, real-time updates, and customer support through chatbots and virtual assistants.
- Seamless Payment Systems: AI can facilitate smart ticketing systems with dynamic pricing and contactless payments, making the payment process smoother for passengers.
- Smart Resource Allocation: AI can help deploy resources more efficiently, reducing waste and improving the sustainability of transportation networks.
- Demand Prediction: AI can analyze patterns to forecast future transportation needs, aiding in better planning and resource allocation.
- Multi-modal Transport Solutions: AI can integrate various modes of transportation (e.g., buses, trains, bikes, ridesharing) into a cohesive system, providing users with seamless end-to-end travel options and solving the last-mile problem.
- Smart City Initiatives: AI in public transportation can be part of broader smart city initiatives, improving overall urban mobility and connectivity.
- Enhanced Analytics: AI can process vast amounts of data to provide insights and support decision-making for transportation authorities and operators.
- Performance Monitoring: Continuous monitoring and analysis of system performance can lead to ongoing improvements and innovation in public transportation.
-
🎇 On National Day, I went for a leisurely drive in San Francisco and ended up "stress-testing" a Waymo self-driving car on the road. 🚗 As an autonomous driving practitioner, who wouldn't be curious about the real-time robustness of the cutting-edge Waymo One driving system?

While cruising downtown, I happened to notice a Waymo car tailgating me. While this isn't unusual for SF residents, a wild, "evil" idea suddenly hit me: why not directly "adversarially attack" the world's autonomous driving status quo? I executed an unexpected maneuver, suddenly reversing, to see how it would react.

🌟 The response was stellar! The moment I reversed, Waymo One honked instantly, quicker than any human could, activated its hazard lights, and backed away to maintain a safe distance. This reckless move on my part served as an edge case to test the algorithm's robustness under extreme conditions and, potentially, could be a challenging training sample to enhance Waymo's future autonomous systems.

💫 Thrilled to personally experience how current cutting-edge autonomous algorithms handle rare driving behaviors, and how stable and safe Level 4 autonomy is in dealing with diverse scenarios. However, it also prompted deep reflection as an AI researcher in this field: 🤔 In an industry with little room for error, how can we ultimately avoid or minimize issues that AI fails to handle?

💡 I believe two research directions are particularly promising for achieving Level 5 autonomy in future mobility systems:
- 1️⃣ Development and deployment of vehicle-to-everything (V2X) cooperative systems (including V2V, V2I, V2P, etc.). Our initial studies (e.g., V2X-ViT, ECCV 2022, arxiv.org/abs/2203.10638) show that in scenarios with severe occlusions or noise, such cooperative systems can significantly enhance the robustness of perception systems, thereby improving traffic safety.
- 2️⃣ Adversarial scenario generation (including Sim-to-Real and generative modeling).
Research done by my colleagues at UCLA (V2XP-ASG, ICRA 2023, https://lnkd.in/gu5nKVHD) shows that adversarial learning techniques can effectively simulate adversarial scenarios, greatly improving model robustness in complex "corner case" situations. This matters because it's often infeasible to collect such collision scenarios in the real world.

👨🏫 As a new Assistant Professor of CS at Texas A&M University, I will lead a group focusing on these exciting research directions, which can be a proactive approach to reducing accidents and improving safety for all. 🔥 I look forward to future collaborations with governments, academics, and companies to research and develop data and algorithms that can help enhance the safety of vulnerable road users, especially seniors and children. We envision a people-centered intelligent transportation system in the future. Interested in these topics? Let's connect and discuss further! #AutonomousVehicles #AI #MachineLearning #SmartCities #Transportation #Humanity #Mobility