Gareth James - Emory University, Goizueta Business School. Atlanta, GA, US

Gareth James

John H. Harland Dean and Professor of Information Systems & Operations Management | Emory University, Goizueta Business School

Atlanta, GA, UNITED STATES

"Data is the sword of the 21st century; those who wield it well, the Samurai." - Jonathan Rosenberg, adviser to Larry Page and former Google SVP


Biography

Gareth James became the John H. Harland Dean of Goizueta Business School in July 2022. Renowned for his visionary leadership, statistical mastery, and commitment to the future of business education, James brings vast and versatile experience to the role. His collaborative nature and data-driven scholarship offer fresh energy and focus aimed at furthering Goizueta’s mission: to prepare principled leaders to have a positive influence on business and society.

James is a dynamic scholar and leader. His extensive published work includes numerous articles, conference proceedings, and book chapters focused on statistical and machine learning methodologies. James is also a co-author of the widely used textbook An Introduction to Statistical Learning.

James brings a powerful optimism and contagious enthusiasm to further the work Goizueta is doing, not only through the school’s stellar scholarship, but also by continuing to build strong bridges to the business community. He believes in the central role that business plays in society and the impact that Goizueta has in preparing the thinkers and innovators of tomorrow. His ambition to drive excellence and strengthen Goizueta’s future is fueled by his experience in data-informed decision making, strategy, and support.

James joins the Goizueta family from USC's Marshall School of Business, where he served in a multitude of pivotal roles. As interim dean (2019-2020), he led the school's COVID-19 response. He also served as vice dean of faculty and as deputy dean (2020-2022), a position created to retain him in school-level leadership.

A noted researcher, his work has been cited more than 20,000 times. James has led multiple National Science Foundation research grants and has served as an associate editor for five top research journals. The recipient of two Dean's Research Awards from the Marshall School of Business, he is a life member and elected Fellow of the American Statistical Association and the Institute of Mathematical Statistics.

James is also a superb teacher and mentor. In addition to the Evan C. Thompson Faculty Teaching and Learning Innovation Award, he is a three-time winner of the Marshall School of Business’ Golden Apple Award for best instructor in the full-time MBA program. He has also been awarded Marshall and USC’s highest honors for mentoring junior colleagues and graduate students, including the Dean’s Ph.D. Advising, USC Mellon, Evan C. Thompson and Provost’s Mentoring awards.

Education (3)

Stanford University: PhD, Statistics

The University of Auckland: BComm, Finance

The University of Auckland: BS, Statistics

Areas of Expertise (4)

Statistical Problems in Marketing

Functional Data Analysis

Statistical Methodology

High Dimensional Regression

Publications (7)

Asymmetric error control under imperfect supervision: a label-noise-adjusted Neyman-Pearson umbrella algorithm

Journal of the American Statistical Association

S Yao, B Rava, X Tong, G James

2021-12-09

Label noise in data has long been an important problem in supervised learning applications as it affects the effectiveness of many widely used classification methods. Recently, important real-world applications, such as medical diagnosis and cybersecurity, have generated renewed interest in the Neyman–Pearson (NP) classification paradigm, which constrains the more severe type of error (e.g., the Type I error) under a preferred level while minimizing the other (e.g., the Type II error).

Leapfrogging, Cannibalization, and Survival During Disruptive Technological Change: The Critical Role of Rate of Disengagement

Journal of Marketing

D Chandrasekaran, GJ Tellis, GM James

2020-12-17

When faced with new technologies, the incumbents’ dilemma is whether to embrace the new technology, stick with their old technology, or invest in both. The entrants’ dilemma is whether to target a niche and avoid incumbent reaction or target the mass market and incur the incumbent’s wrath. The solution is knowing to what extent the new technology cannibalizes the old one or whether both technologies may exist in tandem. The authors develop a generalized model of the diffusion of successive technologies, which allows for the rate of disengagement from the old technology to differ from the rate of adoption of the new. The model helps managers estimate evolving proportions of segments that play different roles in the competition between technologies and predict technological leapfrogging, cannibalization, and coexistence.

Heteroscedasticity-Adjusted Ranking and Thresholding for Large-Scale Multiple Testing

Journal of the American Statistical Association

L Fu, B Gang, GM James, W Sun

2020-12-04

Standardization has been a widely adopted practice in multiple testing, for it takes into account the variability in sampling and makes the test statistics comparable across different study units. However, despite conventional wisdom to the contrary, we show that there can be a significant loss in information from basing hypothesis tests on standardized statistics rather than the full data.

Irrational exuberance: Correcting bias in probability estimates

Journal of the American Statistical Association

GM James, P Radchenko, B Rava

2020-08-17

We consider the common setting where one observes probability estimates for a large number of events, such as default risks for numerous bonds. Unfortunately, even with unbiased estimates, selecting events corresponding to the most extreme probabilities can result in systematically underestimating the true level of uncertainty. We develop an empirical Bayes approach “excess certainty adjusted probabilities” (ECAP), using a variant of Tweedie’s formula, which updates probability estimates to correct for selection bias.

Penalized and constrained optimization: an application to high-dimensional website advertising

Journal of the American Statistical Association

GM James, C Paulson, P Rusmevichientong

2019-06-19

Firms are increasingly transitioning advertising budgets to Internet display campaigns, but this transition poses new challenges. These campaigns use numerous potential metrics for success (e.g., reach or click rate), and because each website represents a separate advertising opportunity, this is also an inherently high-dimensional problem. Further, advertisers often have constraints they wish to place on their campaign, such as targeting specific sub-populations or websites. These challenges require a method flexible enough to accommodate thousands of websites, as well as numerous metrics and campaign constraints. Motivated by this application, we consider the general constrained high-dimensional problem, where the parameters satisfy linear constraints. We develop the Penalized and Constrained optimization method (PaC) to compute the solution path for high-dimensional, linearly constrained criteria.

Efficient large-scale internet media selection optimization for online display advertising

Journal of Marketing Research

C Paulson, L Luo, GM James

2018-08-01

In today's digital market, the number of websites available for advertising has ballooned into the millions. Consequently, firms often turn to ad agencies and demand-side platforms (DSPs) to decide how to allocate their Internet display advertising budgets. Nevertheless, most extant DSP algorithms are rule-based and strictly proprietary. This article is among the first efforts in marketing to develop a nonproprietary algorithm for optimal budget allocation of Internet display ads within the context of programmatic advertising. Unlike many DSP algorithms that treat each ad impression independently, this method explicitly accounts for viewership correlations across websites.

Functional additive regression

The Annals of Statistics

Y Fan, GM James, P Radchenko

2015-10-01

We suggest a new method, called Functional Additive Regression, or FAR, for efficiently performing high-dimensional functional regression. We demonstrate that FAR can be implemented with a wide range of penalty functions using a highly efficient coordinate descent algorithm. Theoretical results are developed which provide motivation for the FAR optimization criterion. Finally, we show through simulations and two real data sets that FAR can significantly outperform competing methods.

Working Papers/Projects (3)

Functional Data Analysis

The key tenet of Functional Data Analysis (FDA) is to treat the measurements of a function or curve not as multiple data points but as a single observation of the function as a whole. This approach allows one to more fully exploit the structure of the data. FDA is an inherently multidisciplinary area and is becoming increasingly important as technological changes make it more common to observe functional data. A common FDA situation involves a functional regression problem, where one observes a response Y and a functional predictor X(t) measured over time or some other domain. The goal is then to build a model to predict Y from X(t). Fitting such a model is more challenging than standard linear regression because the predictor is now an infinite-dimensional object.
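One standard way to make the infinite-dimensional predictor tractable is a basis expansion: reduce each observed curve to a handful of basis scores, then regress Y on those scores. The sketch below is a hypothetical Python illustration of that generic idea, with simulated curves, an arbitrary Fourier basis, and made-up coefficients; it is not code from any of the papers or projects described here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each X_i(t) is observed on a common time grid rather than continuously.
n, grid = 200, np.linspace(0.0, 1.0, 50)

# A small, illustrative basis: constant plus first Fourier pair.
basis = np.column_stack([
    np.ones_like(grid),
    np.sin(2 * np.pi * grid),
    np.cos(2 * np.pi * grid),
])

# Simulate curves as random basis combinations plus measurement noise.
coefs = rng.normal(size=(n, basis.shape[1]))
X = coefs @ basis.T + 0.1 * rng.normal(size=(n, grid.size))

# Toy truth: Y depends on each curve only through its basis scores.
w_true = np.array([1.0, 2.0, -1.0])
y = (X @ basis / grid.size) @ w_true + 0.05 * rng.normal(size=n)

# Project each curve onto the basis (dot product / grid size approximates
# the integral), turning the functional problem into ordinary regression.
scores = X @ basis / grid.size
beta, *_ = np.linalg.lstsq(scores, y, rcond=None)
```

With the functional predictor summarized by three scores per curve, `beta` recovers the weights that generated `y`; in real FDA work the basis choice and its dimension are themselves modeling decisions.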

High Dimensional Regression

Traditionally, statistics has involved extracting information from relatively small data sets, with perhaps on the order of a hundred observations and ten predictors, or independent variables. However, recent technological advances in areas as diverse as web-based advertising, finance, supermarket bar-code readers (linked to customer cards), and even microarrays in genetics have led to an entirely new type of data: high-dimensional data (HDD). Such data typically have anywhere from ten to a few hundred observations but possibly up to tens of thousands of variables. Dealing with such data poses very significant statistical and computational challenges. Trying to find the one or two important variables among thousands with, say, only 10 observations is roughly analogous to the traditional "finding a needle in a haystack," with the added challenge that you only get 10 guesses before your time is up. HDD has become one of the most important areas of research in statistics.
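A classical tool for this needle-in-a-haystack setting is sparse regression such as the lasso, which shrinks most coefficients exactly to zero. The following is a minimal Python sketch using a textbook cyclic coordinate-descent solver with soft-thresholding; the dimensions, data, and penalty level are invented for illustration and do not come from any specific paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical haystack: 10 observations, 1,000 candidate variables,
# only two of which actually influence the response.
n, p = 10, 1000
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[3, 7]] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.normal(size=n)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso (L1-penalized least squares) via cyclic coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)   # per-column sums of squares
    resid = y.copy()                # current residual y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of variable j with the partial residual
            # (residual with variable j's own contribution added back).
            rho = X[:, j] @ resid + col_ss[j] * beta[j]
            # Soft-threshold: small correlations are set exactly to zero.
            new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
            resid += X[:, j] * (beta[j] - new)
            beta[j] = new
    return beta

beta_hat = lasso_cd(X, y, lam=2.0)
selected = np.flatnonzero(np.abs(beta_hat) > 0.1)
```

The soft-thresholding step is what produces sparsity: most of the thousand coefficients end up exactly zero, leaving a short candidate list of variables. With only 10 observations against 1,000 variables, exact recovery of the true pair is not guaranteed, which is precisely the statistical challenge the paragraph above describes.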

Statistical Problems in Marketing

There are many interesting statistical problems in marketing. My main goal here is to bring new ideas from the statistical literature to bear on practical marketing problems. For example, in one paper I used methods from the functional data analysis literature to accurately predict the market penetration of 21 new products across 70 different countries. In another paper, my coauthors and I proposed a new statistical methodology for predicting the trajectory of new technologies over time. We collected data over time for an extensive set of technologies and showed that our approach was, overall, more accurate than well-known laws such as Moore's Law.

In the News (2)

Emory Goizueta’s New Dean Wants To Take The B-School From ‘Strong’ To ‘Premier’

Poets & Quants (online)

2022-02-28

Gareth James will officially take the helm at the Goizueta School in July after 24 years at USC Marshall. He takes over a program with “impressive ambitions,” one “whose very name represents an important legacy for both Emory and the Atlanta region.”

Emory names new dean of Goizueta Business School

Atlanta Business Chronicle

2022-02-14

Gareth James has been named the new dean of Emory University's Goizueta Business School.
