News 2019

August 2019

Rayid Ghani, Pioneer in Applying AI to Social Issues, Joins Carnegie Mellon

Byron Spice

Rayid Ghani, a pioneer in using data science and artificial intelligence to solve major social and policy challenges and the former chief scientist for Barack Obama's 2012 re-election campaign, will join the Carnegie Mellon University faculty this fall. Ghani, a CMU alumnus, will be a Distinguished Career Professor with a joint appointment to the Heinz College of Information Systems and Public Policy and the School of Computer Science. In his new role, Ghani will continue his work as a leader in harnessing the potential of artificial intelligence, data science and other emerging technologies for social good. His work in these areas complements the research being done in CMU's Metro21: Smart Cities Institute and the Block Center for Technology and Society.
"My focus over the last several years has been on enabling the use of AI, machine learning and data science to tackle large-scale public policy and social impact problems in an ethical and equitable manner, whether it was with government agencies, foundations, nonprofits or the private sector," Ghani said. "I look forward to joining a world-class institution that shares those same values around the use of technology to create positive and equitable social outcomes."
Ghani comes to CMU from the University of Chicago, where he was director of the Center for Data Science and Public Policy, research associate professor in the Department of Computer Science, and a senior fellow in the Harris School of Public Policy. At the university, he developed the Data Science for Social Good Fellowship to equip the next generation of statisticians, computer scientists and social scientists with the skills necessary to address real-world and large-scale social problems using data science and machine learning.
Previously, Ghani served as chief scientist for the Obama for America 2012 Election Campaign, where he specialized in using data, analytics and technology to reach, persuade and mobilize voters, donors and volunteers. Prior to that, he was a senior research scientist and director of analytics research at Accenture Labs. Ghani received a master's degree in machine learning at Carnegie Mellon, where he studied under Tom Mitchell, the Founders University Professor of Machine Learning and Computer Science and founder of the world's first Machine Learning Department. "Rayid has played a pivotal role in shaping our field's understanding of how the innovative technologies CMU has developed can be implemented for the greater social good," said Martial Hebert, dean of the School of Computer Science. "We look forward to welcoming him back to the Carnegie Mellon community, where his invaluable insight into analytics, machine learning and data science will continue to drive the social impact of CMU's research centers." Ghani serves on the Board of Directors of the Data Science for Social Good Foundation, as well as on the Steering Committee for the AI for Good Foundation. In 2014, he was named a "Young Global Leader" by the World Economic Forum. "Rayid's understanding of the greater impacts of technology on society — and its potential for promoting shared prosperity — makes him a perfect addition to our team," said Ramayya Krishnan, dean of the Heinz College. "We are excited to see how Rayid will contribute to advancing the mission of our research centers, as well as of Carnegie Mellon as a whole."

Herbsleb, Narasimhan Named Interim Heads of Software Research, Robotics

Byron Spice

School of Computer Science Dean Martial Hebert has named James Herbsleb to serve as interim director of the Institute for Software Research and Srinivasa Narasimhan to serve as interim director of the Robotics Institute. Herbsleb, an ISR professor, will begin his new role at the end of August, when current director Bill Scherlis goes on leave. Narasimhan will take the place of Hebert, who became SCS dean as of Aug. 15. Both will serve until permanent leaders are identified for their departments.
Herbsleb is best known for his research on collaboration and coordination in large-scale software engineering projects, and for developing and testing a theory of coordination that brings together the technical and human aspects of software development. He has addressed such topics as how development teams can function and collaborate even when they are geographically dispersed. He also has explored issues related to open-source development, both in individual projects and in large-scale ecosystems of interdependent projects. Herbsleb has received multiple awards, most recently the Outstanding Research Award presented by the Association for Computing Machinery's Special Interest Group on Software Engineering (SIGSOFT).
Narasimhan, a professor in the Robotics Institute, has established a notable sensing group within the institute. His group focuses on novel techniques for imaging, illumination and light transport to enable applications in vision, graphics, robotics, intelligent transportation, smart cities, agriculture and medical imaging. He has led the development of such innovations as programmable headlights, and is leading research into non-line-of-sight imaging. He is associate director of a National Science Foundation Expeditions in Computing project that is developing cameras to see deep beneath the skin. Narasimhan was the inaugural director of the Robotics Institute's first-of-its-kind master's degree program in computer vision. He has won a wide variety of awards and best paper citations.

Carnegie Mellon Robotics Team Wins Initial DARPA Event

Byron Spice

Team Explorer from Carnegie Mellon University and Oregon State University deployed robots to autonomously map and search underground mines and outscored 10 competing teams at the initial scored event in the DARPA Subterranean Challenge. On four occasions during the eight-day event, each team deployed multiple robots into National Institute for Occupational Safety and Health research mines in South Park Township, Pennsylvania. The robots navigated on their own for an hour at a time as they searched for objects, such as simulated human survivors, in a mine-disaster scenario. Team Explorer detected and pinpointed 25 of these artifacts in its two best runs, 14 more than any other team. In addition to being named champions of the event, the team was also cited for "Most Accurate Artifact" for identifying and locating a backpack inside a mine less than 20 centimeters from its actual position. "All the teams worked very hard to get here, and each took a slightly different approach to the problem," said Matt Travers, a system scientist in CMU's Robotics Institute and co-leader of Team Explorer. "This was a great experience for all of us and we are proud of the performance by our team members and our robots." The event, which concluded today in Pittsburgh, was sponsored by the Defense Advanced Research Projects Agency. Called the Tunnel Circuit, it was the first of four events planned for the Subterranean Challenge, which will conclude two years from now with one team claiming a $2 million grand prize. The challenge will develop technologies needed by military and civilian first responders when faced with damaged underground environments suspected to be unsafe for humans. To accomplish their mission in the Tunnel Circuit, the robots needed to cope with mine conditions such as mud and water, and to communicate with each other and with a base station outside the mine despite the radio limitations inherent in underground operations. 
"Mobility was a big advantage for us," said Sebastian Scherer, an associate research professor in the Robotics Institute and co-leader of the team. Unlike most teams, which relied on off-the-shelf robots, Explorer designed and built its two ground vehicles and two drones specifically to operate in the mines. "We had big wheels and lots of power," he added, "and autonomy that just wouldn't quit." Geoff Hollinger, an assistant professor of mechanical engineering at Oregon State and a CMU robotics alumnus, and his students provided additional expertise in multirobot systems. As typical of CMU robotics, the team tested their robots and their procedures rigorously — in this case at the Tour-Ed Mine in Tarentum, Pennsylvania, prior to the event. "The testing was brutal at the end," Scherer said, with early morning meetings to resolve problems from the previous day, followed by long days at the mine. "But it paid off. We were prepared for this." The team includes about 30 faculty, students and staff members from Carnegie Mellon and from Oregon State. The team benefited from previous CMU work on mining applications and from experience in other DARPA challenges, including its victory in the 2007 DARPA Urban Challenge robot race that popularized self-driving cars. Future events in the Subterranean Challenge will include an Urban Circuit, where robots will explore complex underground facilities; and a Cave Circuit, where the robots will operate in natural caves. Explorer is one of seven teams that will receive up to $4.5 million from DARPA to develop their hardware and software for the competition. The team is also sponsored by the Richard King Mellon Foundation, Schlumberger, Microsoft, Boeing, FLIR Systems, Near Earth Autonomy, Epson, Lord and Doodle Labs. Support the team by making a secure, online contribution.

Study Shows Apps Are Rife With Privacy Compliance Issues

Daniel Tkacik

Android users can choose from more than 2.7 million apps in the Google Play Store — a daunting number for a privacy researcher who wants to investigate if those apps comply with privacy laws. But fear not, privacy researchers. There's a new tool in town, and it's revealed some eye-opening data about the state of privacy for Android apps. A team of researchers from Carnegie Mellon University and Fordham University recently created the Mobile App Privacy System (MAPS), a tool that uses natural language processing, machine learning and code analysis to identify potential privacy compliance issues by inspecting apps' privacy policies and code. The researchers tested MAPS on more than a million Android apps, and presented their findings at last month's Privacy Enhancing Technologies Symposium in Stockholm, Sweden. "The sheer number of apps in app stores, combined with their complexity and all of the different third-party interfaces they may use, make it impossible for regulators to systematically look for privacy compliance issues," said Norman Sadeh, a professor in the Institute for Software Research and principal investigator on the study. "This tool provides a system for systematically identifying potential privacy issues at scale, and can be customized to help app store operators or regulators focus on issues relevant to specific privacy regulations." The tool also allows users to filter results and focus on subsets of interest, such as apps with more than a certain number of downloads, specific categories of apps or particular types of potential compliance issues. To analyze the state of privacy for Android apps, the team used MAPS to analyze 1,039,003 apps downloaded from the Google Play Store. The analysis took the tool about three weeks, working at an average rate of 2,023 apps per hour.
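As a quick arithmetic check, the throughput figures reported above are consistent with each other (this sketch uses only the numbers from the article):

```python
# Sanity-check the reported MAPS throughput: 1,039,003 apps
# processed at an average of 2,023 apps per hour.
apps_analyzed = 1_039_003
apps_per_hour = 2_023

hours = apps_analyzed / apps_per_hour
days = hours / 24
weeks = days / 7

# About three weeks, matching the article's figure.
print(f"{hours:.0f} hours = {days:.1f} days = {weeks:.1f} weeks")
```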
The researchers found that nearly half of the apps did not have privacy policy links, despite most of them (89%) suggesting that their code engages in at least one practice that would require disclosure under a particular jurisdiction. "When policies do exist, many seem to inaccurately portray the practices performed by the app," said Sadeh, who is also affiliated with Carnegie Mellon's CyLab Security and Privacy Institute. "For example, 12% of apps' policies did not seem to accurately describe how the app is handling your location data." Sadeh cautioned that these results require further manual vetting, because not all potential compliance issues are necessarily actual violations. For instance, code that may appear to share sensitive information with third parties may not actually be executed. "For practical reasons, we were only able to fully vet a tiny fraction of our results, but many of those results that were checked proved to correspond to actual compliance issues," Sadeh said. "In particular, the tool was used as part of a project with a large European electronics manufacturer to check several of their mobile apps for compliance with the European General Data Protection Regulation (GDPR)." On average, the researchers found about three potential privacy compliance issues per app. They also found that while newer apps were more likely to have privacy policies, they also had more potential issues than older apps. "Overall, we found that Google's efforts to push developers to post privacy policies may not be enough," Sadeh said. "Developers may not be able or willing to adequately describe their apps' behaviors without proper tools and incentives." Sadeh further noted that this particular study was conducted just before the GDPR took effect. Under GDPR, companies are subject to more stringent disclosure requirements and face steeper penalties for not complying. "One can hope that with GDPR, the number of compliance issues will diminish over time," Sadeh said. 
"At the same time, our research as well as that of others suggests that many app developers simply lack the sophistication and the resources necessary to be fully compliant. This is an area where, in my view, App Store operators should be more proactive and provide additional support to app developers." Other researchers on the study included former CMU computer science postdoctoral associate Sebastian Zimmeck; graduate students Peter Story, Daniel Smullen, Abhilasha Ravichander and Ziqi Wang; and Fordham University law faculty members Joel Reidenberg and N. Cameron Russell.

New CMU Faculty Member Offers Crash Course in AI

Virginia Alvino Young

A new way to learn about artificial intelligence premiered on YouTube this month as part of Crash Course, a series of quick-paced, imaginative videos on a range of topics that's amassed hundreds of millions of views. The AI installment, covering everything from algorithmic bias to neural networks, was co-written by incoming CMU faculty member Yonatan Bisk. "This is increasingly pervasive technology, so it's good to understand that all of these systems in your daily life are running AI under the hood," Bisk said. Currently a postdoctoral researcher at Microsoft Research, Bisk will join CMU's School of Computer Science in the fall of 2020. He co-wrote the AI series with Tim Weninger of Notre Dame and Lana Yarosh of the University of Minnesota, and is the primary script writer on episodes about natural language processing, robotics and the future of AI. Other topics in the series include game playing and the potentially problematic aspects of AI. Many Crash Course viewers are young people who may or may not go on to work in computer science, but Bisk said that it's important for anyone reading the news or being impacted by AI-related government policies to understand the terminology. "To most people, all AI things are a black box. Hopefully, if nothing else, this series provides viewers with a certain level of agency in terms of playing around in the space again, but also a certain realism about what the technology can actually do," Bisk said. Unlike its predecessors, this installment of Crash Course includes hands-on labs. For some topics, supplementary videos will be provided that walk viewers through sample code they can download and run in their own browsers. For Bisk, it's all about inclusion. By giving people sample code, "they can either literally reproduce what they just saw in the YouTube video, or they can experiment by changing data, learning algorithms and seeing what effect that has on model performance or predictions," he said.
"The more you dig in, the more Python you'll need to know, but even beginners can get started and experiment." The series includes 15 episodes of curriculum, and five episodes of hands-on labs, all of which are hosted by YouTube personality Jabril Ashe. Crash Course is executive produced by brothers Hank and John Green with PBS Digital Studios. A 2017 installment on computer science was co-written by CMU Human-Computer Interaction Institute faculty members Amy Ogan and Chris Harrison.

#MeToo Media Coverage Sympathetic to but Not Necessarily Empowering for Women

Virginia Alvino Young

The #MeToo movement has encouraged women to share their personal stories of sexual harassment. While the movement amplifies previously unheard voices, a Carnegie Mellon University analysis of #MeToo media coverage shows accusers are often portrayed as sympathetic, but with less power and agency than their alleged perpetrators. "The goal of the movement is to empower women, but according to our computational analysis that's not what's happening in news stories," said Yulia Tsvetkov, assistant professor in the School of Computer Science's Language Technologies Institute. Tsvetkov's research team used natural language processing (NLP) techniques to analyze online media coverage of #MeToo narratives that included 27,602 articles in 1,576 outlets. In a paper published earlier this year, they also looked at how different media outlets portrayed perpetrators, and considered the role of third-party actors in news stories. "Bias can be unconscious, veiled and hidden in a seemingly positive narrative," Tsvetkov said. "Such subtle forms of biased language can be much harder to detect and to date we have no systematic way of identifying them automatically. The goal of our research was to provide tools to analyze such biased framing." Their work draws insights from social psychology research, and looks at the dynamics of power, agency and sentiment, which is a measurement of sympathy. The researchers analyzed verbs to understand their meaning, and put them into context to discern their connotation. Take, for instance, the verb "deserves." In the sentence "The boy deserves praise," the verb takes on a very different meaning than in the context of "The boy deserves to be punished." "We were inspired by previous work that looked at the meaning of verbs in individual sentences," Tsvetkov said. "Our analysis incorporates context." This method allowed her team to consider much longer chunks of text and to analyze narrative. 
The research team developed ways to generate scores for words in context, and mapped out the power, sentiment and agency of each actor within a news story. Their results show that the media consistently presents men as powerful, even after sexual harassment allegations. Tsvetkov said this threatens to undermine the goals of the #MeToo movement, which is often characterized as "empowerment through empathy." The team's analysis also showed that the people portrayed with the most positive sentiment in #MeToo stories were those not directly involved with allegations, like activists, journalists or celebrities commenting on the movement, such as Oprah Winfrey. A supplementary paper extending the analysis was presented by graduate student Anjalie Field in Florence, Italy, last month at the Association for Computational Linguistics conference. This paper proposes different methods for measuring power, agency and sentiment, and analyzes the portrayals of characters in movie plots, as well as prominent members of society in general newspaper articles. One of the consistent trends detected in both papers is that women are portrayed as less powerful than men. This was evident in an analysis of the 2016 Forbes list of most powerful people. In news stories from myriad outlets about women and men who ranked similarly, men were consistently described as being more powerful. "These methodologies can extend beyond just people," Tsvetkov said. "You could look at narratives around countries, if they are described as powerful and sympathetic, or unfriendly, and compare that with reactions on social media to understand the language of manipulation, and how people actually express their personal opinions as a consequence of different narratives." Tsvetkov said she hopes this work will raise awareness of the importance of media framing. "Journalists can choose which narratives to highlight in order to promote certain portrayals of people," she said. 
"They can encourage or undermine movements like #MeToo. We also hope that the tools we developed will be useful to social and political scientists, to analyze narratives about people and abstract entities such as markets and countries, and to improve our understanding of the media landscape by analyzing large volumes of texts."

Carnegie Mellon Team Flexes Hacking Prowess, Wins Fifth DefCon Title

Daniel Tkacik

Carnegie Mellon University's competitive hacking team, the Plaid Parliament of Pwning (PPP), just won its fifth hacking world championship in seven years at this year's DefCon security conference, widely considered the "World Cup" of hacking. The championship, played in the form of a virtual game of "capture the flag," was held August 8-11 in Las Vegas. PPP now holds two more DefCon titles than any other team in the 23-year history of DefCon hosting the competition. "If you're wondering who the best and brightest security experts in the world are, look no further than the capture the flag room at DefCon," said David Brumley, a professor of electrical and computer engineering at Carnegie Mellon and the faculty advisor to the team. Three of the five biggest data breaches ever have occurred in the past 12 months, leaking nearly 2 billion personal records. For security experts trying to defend against these types of attacks, the annual DefCon conference provides an opportunity to hone their skills and practice on one another. "These competitions are so much more than just games," said Zach Wade, a student in Carnegie Mellon's School of Computer Science and one of PPP's team captains. "They bring together the security community to share and test new ideas that can be used to strengthen the security of the systems and devices we use every day." Over the course of the 72-hour hacking spree, teams made up of students, industry workers and government contractors attempted to break into each other's systems, stealing virtual "flags" and accumulating points. To add drama, team scores were hidden from view on the second day, and scores and rankings were hidden on the last day, sending teams into a hacking frenzy. "Our team's success reflects our dedication to training the problem solvers of the future," said Jon Cagan, interim dean of Carnegie Mellon's College of Engineering.
This year's competition consisted of 16 pre-qualified teams with members from at least seven countries around the world. Team "HITCONxBfKin" from Taiwan placed second overall, with team "Tea Deliverers" from China trailing in third. The Carnegie Mellon hacking team first formed in 2009 and began competing at DefCon in 2010. The team previously won the contest in 2013, 2014, 2016, and 2017.

Carnegie Mellon, Oregon State Robotics Team Prepares for Subterranean Challenge

Byron Spice

A pair of wheeled robots and a pair of drones, assembled by researchers at Carnegie Mellon University and Oregon State University, will work together to autonomously map and search an underground mine as competition begins this week in the $2 million DARPA Subterranean Challenge. The first scored event in the multiyear competition, sponsored by the Defense Advanced Research Projects Agency, will take place Aug. 15–22 in a research mine operated by the National Institute for Occupational Safety and Health in South Park Township, outside Pittsburgh. The Carnegie Mellon/Oregon State team, Explorer, is one of 11 teams DARPA has qualified to compete in the event, where the robots will face a mine disaster scenario. The robots will be scored on their ability to develop a 3D map of the mine and identify a variety of objects positioned in it, including simulated human survivors. "This is a task that requires robot autonomy, perception, networking and mobility for us to be successful," said Sebastian Scherer, who leads the team with Matt Travers, both of whom are faculty members in CMU's Robotics Institute. "Underground operations pose many unique challenges for robots, but we've benefited from the Robotics Institute's depth of experience in developing robots that can work in enclosed spaces and dark, dank environments." Geoff Hollinger, an assistant professor of mechanical engineering at Oregon State and a CMU robotics alumnus, will work with his students to provide additional expertise in multirobot systems. The team, which includes about 30 faculty, staff members and students, has tested its robots and procedures extensively at the Tour-Ed Mine in Tarentum, Pennsylvania. One of the team's major challenges has been maintaining communications between the robots and the human operator who oversees them from outside the mine, said Steven Willits, Explorer's lead test engineer.
The rock walls block radio signals, which means radios are largely useless unless they are in line of sight with each other. So the ground robots, named Rocky 1 and Rocky 2, periodically drop Wi-Fi nodes on the mine floor, creating a communications network as they go. Even so, the number of nodes they carry is limited, so the robots eventually must venture beyond their ad hoc network, operating autonomously to gather data, said Kevin Pluckter, a master's student in robotics who is the lead operator. The robots will relay that information back to the operator once they return within range of the Wi-Fi network. Under DARPA's rules, the teams will have 60 minutes to complete their mapping and search missions. Both Rocky 1 and Rocky 2 can run the entire time, but the drones have more limited flight times. The drones will therefore be used when the ground robots meet an obstruction they can't surmount, flying ahead to complete the mission. The Tunnel Circuit is one of three events leading up to the finals. An Urban Circuit, in which robots will explore complex underground facilities, will take place in February 2020; and a Cave Circuit will be in August 2020. A final event incorporating all three environments is scheduled for August 2021 and will determine the winner of the competition's $2 million grand prize. The challenge will develop technologies needed by military and civilian first responders when faced with damaged underground environments suspected to be unsafe for humans. Explorer competes in the Subterranean Challenge's systems track, in which the teams develop physical robotic systems that compete in live environments. But the challenge also includes a virtual track, in which teams develop software and algorithms to compete in simulation-based events.
Nine teams began the virtual competition in July, and the winners will be announced during this month's event. Explorer is one of seven teams that will receive up to $4.5 million to develop their hardware and software for the competition. The team is also sponsored by the Richard King Mellon Foundation, Schlumberger, Microsoft, Boeing, FLIR Systems, Near Earth Autonomy, Epson, Lord and Doodle Labs. Learn more about the team and sponsorships on the Explorer website.
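The relay-dropping strategy Rocky 1 and Rocky 2 use can be sketched as a simple rule: drop a Wi-Fi node whenever the link back to the last node would grow too weak. The sketch below uses a free-space path-loss model, and every number in it (transmit power, RSSI threshold, waypoints) is an illustrative assumption, not Team Explorer's actual parameters:

```python
import math

def path_loss_db(distance_m, freq_mhz=2400):
    """Free-space path loss in dB (standard FSPL formula)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

def plan_node_drops(waypoints, tx_power_dbm=20, min_rssi_dbm=-60):
    """Walk the route; drop a relay node whenever the estimated
    signal back to the last node would fall below min_rssi_dbm."""
    drops = [waypoints[0]]                    # base station at the entrance
    for i in range(1, len(waypoints)):
        dist = math.dist(waypoints[i], drops[-1])
        if tx_power_dbm - path_loss_db(dist) < min_rssi_dbm:
            drops.append(waypoints[i - 1])    # drop at the last good spot
    return drops

# Waypoints every 40 m down a straight tunnel (coordinates in meters).
route = [(0, 0), (40, 0), (80, 0), (120, 0), (160, 0), (200, 0)]
print(plan_node_drops(route))   # [(0, 0), (80, 0), (160, 0)]
```

In a real mine, multipath and rock absorption make propagation far worse than free space, which is why the robots measure actual link quality rather than predicting it; the rule structure, though, is the same.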

Hebert Named Dean of Carnegie Mellon's Top-Ranked School of Computer Science

Byron Spice

Martial Hebert, a leading researcher in computer vision and robotics, has been named dean of Carnegie Mellon University's world-renowned School of Computer Science (SCS), effective August 15. Hebert, director of the Robotics Institute in SCS since 2014, will lead a school with more than 270 faculty members and approximately 2,300 students. He has been a CMU faculty member for the last 35 years. "Throughout his career, Martial Hebert has been an extraordinary and collaborative scholar who has elevated the global importance and prominence of robotics and computer science research," said CMU Provost James H. Garrett Jr. "We are so fortunate that he will now lead our School of Computer Science as dean, and we are confident that he will succeed in advancing the school's world-renowned academic, research and entrepreneurial mission." Carnegie Mellon has long shaped the discipline of computer science, establishing the pioneering Computer Science Department in 1965 and founding the nation's first computer science college more than 30 years ago. It is consistently ranked number one by U.S. News & World Report, and recently earned a No. 1 ranking in artificial intelligence (AI). In 2018, SCS introduced the first undergraduate major in AI. From the beginnings of computer science more than six decades ago, Carnegie Mellon has been known for defining the broadest possible view of the field with a focus on the greatest impact. SCS and the university have led work in developing innovative software and computational techniques by uniting leaders in computation with those in other disciplines. This has led to new fields of exploration at the intersection of computer science with biology, neuroscience, linguistics, psychology and engineering. The university has taken a leadership role in its work on cybersecurity, on the implications of computation for society and on the future of work itself. 
SCS spans seven departments, including the Computer Science Department, as well as areas devoted to machine learning, language technologies, human-computer interaction, computational biology, software research and robotics. It shares a multitude of joint faculty appointments and collaborations across the university, notably establishing CMU AI, a campus-wide initiative to fulfill the promise of AI across disciplines. SCS researchers have established a number of spinoff companies, such as Duolingo, Petuum, Wombat Security Technologies, Near Earth Autonomy and Astrobotic. SCS faculty and students have led the development of self-driving cars (where Hebert was a key researcher), invented computerized tutors that teach students algebra and conceived an AI program that strategically outplayed professional poker players. Approximately 50% of the school's computer science undergraduate students are female — more than double the national average. It's the highest percentage among America's top-ranked computer science schools. The university boasts 13 winners of the ACM A.M. Turing Award — considered the Nobel Prize of computer science — among its alumni and faculty. These include Alan Perlis, the first Turing Award winner; and Allen Newell and Herbert Simon, two of the founders of AI in the 1950s. "The School of Computer Science has never been better positioned to have a real-world impact than it is now," Hebert said. "As an academic institution, SCS's success relies ultimately on its students, and we are privileged to have the best ones on the planet. The breadth of our research is simply tremendous, ranging from foundational principles to transformational applications." "I treasure the Robotics Institute — I've spent my entire career there — but I can't wait to engage with all of the intellectual ideas that encompass SCS as a whole," Hebert added. "I'm excited about the opportunities for cross-collaboration across the full university." 
A native of France, Hebert earned a doctorate in computer science at the University of Paris. He joined the Robotics Institute in 1984, just five years after it was founded, and was named a full professor in 1999. The Robotics Institute is the world's largest robotics education and research institution. Its operating budget has increased to an all-time high, projected at nearly $90 million, since Hebert became director. A 20-member, campus-wide search committee co-chaired by Lorrie Cranor, director of CMU's CyLab Security and Privacy Institute, and Roni Rosenfeld, head of SCS's Machine Learning Department, sought input from faculty, staff and students on the new dean. With the assistance of the executive search firm Isaacson Miller, they met with highly qualified candidates with diverse backgrounds from across the country. "We spent a lot of time as a committee listening to what the campus community wanted in a dean. We heard the importance of finding a visionary leader with outstanding administrative and fundraising skills and excellent collaborative instincts — someone who is committed to education and students, as well as to diversity, equity and inclusion," Cranor said. "We found the ideal candidate — Martial — right here at CMU. We have seen that he has all the attributes we've been looking for. We're confident he will be a great dean." Rebecca W. Doerge, the Glen de Vries Dean of CMU's Mellon College of Science and member of the search committee, commended Hebert's passion and dedication to connecting computer science to all other disciplines across campus as an important factor in the university choosing him to help shape the academics and direction of SCS. "Martial Hebert possesses the essential qualities needed to build on the tremendous reputation achieved by the school in computer science," Doerge said. 
"Through his unwavering dedication and forward-thinking approach to interdisciplinary science and education, he will create exciting opportunities for students and researchers across the university, providing them with the foundation to make impactful discoveries in advanced technologies and become leaders in emerging fields." Upon joining the CMU faculty, Hebert became part of the Autonomous Land Vehicles program, a precursor of today's research on self-driving vehicles. He performed research on interpreting 3D data from range sensors for obstacle detection, environment modeling and object recognition. For the next three decades, he led major research programs in autonomous systems, including ground and air vehicles, with contributions in the areas of perception for environment understanding and human interaction. His research interests center on computer vision. He has led research on fundamental components, such as scene understanding, object recognition and applying machine learning to computer vision, as well as applications, which include systems that enable older adults and people with disabilities to live more independently. To meet the needs of a rapidly expanding computer vision industry, he created the nation's first master's degree program in computer vision. As director of the Robotics Institute, Hebert has led an institution with more than 800 community members, including faculty, students and staff. Now 40 years old, the institute launched the first Ph.D. program in robotics in 1990. It has spawned dozens of spinoffs and played a key role in convincing a growing number of companies, such as Google, Facebook and Caterpillar, to establish Pittsburgh offices. It also includes the National Robotics Engineering Center, which performs applied research and prototyping services for government and corporate partners. Hebert was appointed to a five-year term as dean. He succeeds Andrew Moore, who stepped down as dean at the end of 2018 to lead Google Cloud AI. 
Tom Mitchell, Founders University Professor, has served as the interim dean and will return to teaching and his pioneering work in machine learning. "I am grateful for Professor Mitchell's willingness to serve as interim dean for the past 10 months," said Garrett. "His leadership during this time of transition has been invaluable, allowing the School of Computer Science to continue its momentum heading into the new academic year. I also wish to thank the search committee, particularly Professor Cranor and Professor Rosenfeld, for their dedication and diligence during the entire process." Hebert will continue to direct the Robotics Institute until an interim director is named.

Amazon Web Services Teams With Pittsburgh Health Data Alliance to Improve Care

Wendy Zellner, UPMC

The Pittsburgh Health Data Alliance (PHDA) has announced that it is working closely with Amazon Web Services (AWS), an Amazon.com company, through a machine learning research sponsorship, to advance innovation in areas such as cancer diagnostics, precision medicine, voice-enabled technologies and medical imaging. A unique consortium formed four years ago by UPMC, the University of Pittsburgh and Carnegie Mellon University, the PHDA uses the big data generated in health care — including patient information in the electronic health record, diagnostic imaging, prescriptions, genomic profiles and insurance records — to transform the way diseases are treated and prevented, and to better engage patients in their own care. New machine learning technologies and advances in computing power, like those offered by Amazon SageMaker and Amazon EC2, make it possible to rapidly translate insights discovered in the lab into treatments and services that could dramatically improve human health. Through the AWS Machine Learning Research sponsorship, PHDA scientists from both Pitt and CMU expect to accelerate research and product commercialization efforts across eight projects, enabling doctors to better predict the course of a person's disease and response to treatment; use a patient's verbal and visual cues to diagnose and treat mental health symptoms; and reduce medical diagnostic errors by mining all the data in a patient's medical record. Data is secure, anonymized and stays with PHDA institutions. A CMU team led by Russell Schwartz, professor of biological sciences and computational biology, and Jian Ma, associate professor of computational biology, will use AWS support to develop algorithms and software tools to better understand the origin and evolution of tumor cells. This project will use machine learning to gain insights into how tumors develop and to predict how they are likely to change and grow in the future. 
"Data-driven, genomic methods guided by an understanding of cancers as evolutionary systems have relevance to numerous aspects of clinical cancer care," Schwartz said. "These include determining which precancerous lesions are likely to become cancers, which cancers have a good or bad prognosis, and which of those with bad prognoses might respond long-term to specific therapies." At Pitt, researcher David Vorp and his team will use AWS resources to improve the diagnosis and treatment of abdominal aortic aneurysms, the 13th-leading cause of death in Western countries. "With the latest advances in machine learning, we are developing an algorithm that will provide clinicians with an objective, predictive tool to guide surgical interventions before symptoms appear, improving patient outcomes," said Vorp, associate dean for research at Pitt's Swanson School of Engineering and the John A. Swanson Professor of Bioengineering. For more information, read the full announcement on the PHDA website.

Storytelling Bots Learn To Punch Up Their Last Lines

Byron Spice

Nothing disappoints quite like a good story with a lousy finish. So researchers at Carnegie Mellon University who work in the young field of automated storytelling don't think they're getting ahead of themselves by devising better endings. The problem is that most algorithms for generating the end of a story tend to favor generic sentences, such as "They had a great time," or "He was sad." Those may be boring, but Alan Black, a professor in CMU's Language Technologies Institute, said they aren't necessarily worse than a non sequitur such as "The UFO came and took them all away." In a paper presented Thursday, Aug. 1, at the Second Workshop on Storytelling in Florence, Italy, Black and students Prakhar Gupta, Vinayshekhar Bannihatti Kumar and Mukul Bhutani presented a model for generating endings that will be both relevant to the story and diverse enough to be interesting. One trick to balancing these goals, Black said, is to require the model to incorporate some key words into the ending that are related to those used early in the story. At the same time, the model is rewarded for using some rare words in the ending, in hopes of choosing an ending that is not totally predictable. Consider this bot-generated story: "Megan was new to the pageant world. In fact, this was her very first one. She was really enjoying herself, but was also quite nervous. The results were in and she and the other contestants walked out." Existing algorithms generated these possible endings: "She was disappointed the she couldn't have to learn how to win," and "The next day, she was happy to have a new friend." The CMU algorithm produced this ending: "Megan won the pageant competition." None of the selections represent deathless prose, Black acknowledged, but the endings generated by the CMU model scored higher than those of the older models, both in automatic evaluations and in ratings by three human reviewers.
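The balance the researchers describe — keeping the ending tied to words from the story while rewarding rarer, less predictable words — can be illustrated with a toy scoring function. This is a simplified sketch of that intuition, not the team's actual model (which is a trained neural generator); the function name, the word-frequency table and the `alpha` weight are all invented for illustration.

```python
import math

def score_ending(story_words, ending_words, corpus_freq, alpha=0.5):
    """Toy score that balances two goals from the article:
    relevance (overlap with words used earlier in the story) and
    novelty (preference for rare words, so endings aren't generic)."""
    if not ending_words:
        return 0.0
    story_set = set(story_words)
    # Relevance: fraction of ending words that also appear in the story.
    relevance = sum(w in story_set for w in ending_words) / len(ending_words)
    # Novelty: inverse log-frequency, so rarer words contribute more.
    novelty = sum(1.0 / math.log(2 + corpus_freq.get(w, 0))
                  for w in ending_words) / len(ending_words)
    # alpha trades off staying on topic against avoiding predictability.
    return alpha * relevance + (1 - alpha) * novelty
```

Under this heuristic, a specific ending such as "Megan won the pageant competition" (which reuses story words like "Megan" and "pageant" that are rare in general text) outscores a generic one built from very common words, such as "She had a great time."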
Researchers have worked on conversational agents for years, but automated storytelling presents new technical challenges. "In a conversation, the human's questions and responses can help keep the computer's responses on track," Black said. "When the bot is telling a story, however, it has to remain coherent for far longer than it does in a conversation." Automated storytelling might be used for generating substories in video games, Black said, or for generating stories that summarize presentations at a conference. Another application might be to generate instructions for repairing something or using complicated equipment that can be customized to a user's skill or knowledge level, or to the exact tools or equipment available to the user.