Socially Optimal Personalized Routing with Preference Learning

Spurred by rapid population growth and urban development, traffic congestion has become inescapable in metropolitan areas across the United States (U.S.), and its direct and indirect effects can be dire. Indeed, congestion can severely impede quality of life, negatively impact health and productivity, and increase commuting costs and pollution. At the same time, support for increased taxation to fund expansion of the existing road network to meet current and future needs is in short supply. For these reasons, it is imperative to find novel ways to improve routing efficiency over the existing infrastructure. A major obstacle to improving the routing efficiency of real-world transportation networks is the inability to enforce socially optimal routes upon commuters, who tend to act in an egocentric fashion without regard for the impact of their choices on the experience of other commuters. The gap between the efficiency of the socially optimal (utopic) solution and that of the equilibrium (de facto) solution is known as the Price of Anarchy. A natural way to improve routing efficiency without infrastructure works is to incentivize commuters to stick to socially optimal routes. Offering financial incentives, however, presents significant practical and political challenges. On the other hand, transit agencies can leverage heterogeneity in commuter preferences (at no cost) to (a) better spread traffic in the network to achieve a socially optimal routing scheme, and (b) propose personalized routes that commuters are likely to adhere to. This project proposes to learn individual driver preferences over route characteristics from their past route choices in order to recommend socially optimal routes that they will follow with high probability. The recommended routes are expected to improve routing efficiency relative to the de facto routes, since they optimize social welfare. 
At the same time, the proposed routes will be implementable in practice. These combined effects will bridge the gap between the utopic and de facto solutions, thereby reducing the Price of Anarchy. The project takes the point of view of a recommendation system with a large number of users but no ability to enforce recommended routes in a highly congested traffic network. The project proposes to: (a) develop a machine learning framework for learning individual driver preferences over time; (b) devise a mathematical model and solution scheme for computing personalized equilibrium routes given limited and imperfect information on driver preferences; (c) leverage this model to compute personalized, implementable, socially optimal routes; (d) quantify the reduction in the Price of Anarchy achieved by the framework in stylized problems; and (e) showcase the performance of this approach on real data.
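The Price of Anarchy defined in the abstract can be made concrete with a classic textbook instance (Pigou's two-link network), which is not part of the project record itself; the short sketch below is purely illustrative, and all names in it are hypothetical.

```python
# Illustrative sketch: Price of Anarchy on Pigou's two-link network.
# A unit mass of drivers travels from s to t over two parallel links:
# one with constant cost 1 per driver, one with cost x per driver when
# a fraction x of the demand uses it.

def total_cost(x):
    """Total travel cost when fraction x takes the congestible link
    (cost x per driver) and 1 - x takes the constant link (cost 1)."""
    return x * x + (1.0 - x) * 1.0

# Selfish (de facto) equilibrium: each driver prefers the congestible
# link whenever its cost x is below 1, so all demand piles onto it.
equilibrium_cost = total_cost(1.0)  # = 1.0

# Socially optimal (utopic) solution: minimize x^2 + (1 - x) over
# x in [0, 1]; a fine grid search suffices for this one-dimensional case.
optimal_cost = min(total_cost(i / 10000) for i in range(10001))  # = 0.75 at x = 0.5

# The ratio of the two is the Price of Anarchy: 4/3 for this network.
price_of_anarchy = equilibrium_cost / optimal_cost
print(price_of_anarchy)  # ≈ 1.3333
```

The gap arises precisely because drivers ignore the congestion cost they impose on others; steering even half of them onto socially optimal routes, as the project envisions, closes part of this gap.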


    • Language: English


    • Status: Active
    • Contract Numbers: 17-05


    • Sponsor Organizations:

      METRANS Transportation Center

      University of Southern California
      Los Angeles, CA  United States  90089-0626

      National Center for Metropolitan Transportation Research

      University of Southern California
      650 Childs Way, RGL 107
      Los Angeles, CA  United States  90089-0626

      California Department of Transportation

      1227 O Street
      Sacramento, CA  United States  95843

      Office of the Assistant Secretary for Research and Technology

      University Transportation Centers Program
      Department of Transportation
      Washington, DC  United States  20590
    • Project Managers:

      Feldman, Doug

    • Principal Investigators:

      Vayanos, Phebe

      Dessouky, Maged

    • Start Date: 20170930
    • Expected Completion Date: 20180930
    • Actual Completion Date: 0

    Subject/Index Terms

    Filing Info

    • Accession Number: 01642999
    • Record Type: Research project
    • Source Agency: National Center for Metropolitan Transportation Research
    • Contract Numbers: 17-05
    • Files: UTC, RiP
    • Created Date: Aug 1 2017 6:23PM