
2020 Publications

A Bayesian Hierarchical Model for Evaluating Forensic Footwear Evidence
Neil A. Spencer and Jared Murray; Annals of Applied Statistics, 14(3) 1449-1470

When a latent shoeprint is discovered at a crime scene, forensic analysts inspect it for distinctive patterns of wear such as scratches and holes (known as accidentals) on the source shoe's sole. If its accidentals correspond to those of a suspect's shoe, the print can be used as forensic evidence to place the suspect at the crime scene. The strength of this evidence depends on the random match probability---the chance that a shoe chosen at random would match the crime scene print's accidentals. Evaluating random match probabilities requires an accurate model for the spatial distribution of accidentals on shoe soles. A recent report by the President's Council of Advisors on Science and Technology criticized existing models in the literature, calling for new empirically validated techniques. We respond to this request with a new spatial point process model for accidental locations, developed within a hierarchical Bayesian framework. We treat the tread pattern of each shoe as a covariate, allowing us to pool information across large heterogeneous databases of shoes. Existing models ignore this information; our results show that including it leads to significantly better model fit. We demonstrate this by fitting our model to one such database.
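As a rough illustration of the random match probability idea (a toy sketch, not the authors' hierarchical model), the snippet below simulates accidental locations from an invented inhomogeneous Poisson intensity on a unit-square sole and estimates, by Monte Carlo, the chance that a randomly drawn shoe reproduces a given print's accidentals; the intensity function, matching tolerance, and matching rule are all assumptions made for the example.

```python
# Toy illustration (not the authors' model): estimating a random match
# probability under a simple inhomogeneous Poisson point-process model of
# accidental locations on a unit-square shoe sole. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)

def sample_accidentals(intensity, lam_max=60.0):
    """Sample one shoe's accidental locations by thinning a homogeneous Poisson process."""
    n = rng.poisson(lam_max)                       # candidate points
    pts = rng.uniform(size=(n, 2))                 # candidate locations on the sole
    keep = rng.uniform(size=n) < intensity(pts) / lam_max
    return pts[keep]

# Hypothetical intensity: wear concentrates toward the toe (larger first coordinate).
intensity = lambda p: 10.0 + 40.0 * np.exp(-((p[:, 0] - 0.8) ** 2) / 0.02)

print_accidentals = sample_accidentals(intensity)  # stand-in for the crime-scene print

def matches(shoe, print_pts, tol=0.05):
    """A shoe 'matches' if every print accidental has a shoe accidental within tol."""
    if len(shoe) == 0:
        return len(print_pts) == 0
    d = np.linalg.norm(print_pts[:, None, :] - shoe[None, :, :], axis=2)
    return bool(np.all(d.min(axis=1) < tol))

n_sim = 5000
rmp = np.mean([matches(sample_accidentals(intensity), print_accidentals)
               for _ in range(n_sim)])
print(f"Monte Carlo random match probability: {rmp:.4f}")
```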

 

A Model of Fake Data in Data-driven Analysis
Journal of Machine Learning Research, 21, 1-26

Data-driven analysis has been increasingly used in various decision-making processes. With more sources, such as reviews, news, and photos, now available for data analysis, the authenticity of data sources is increasingly in doubt. While previous literature has attempted to detect fake data piece by piece, in the current work we try to capture the fake data sender's strategic behavior in order to detect the fake data source. Specifically, we model the tension between a data receiver who makes data-driven decisions and a fake data sender who benefits from misleading the receiver. We propose a potentially infinite-horizon, continuous-time game-theoretic model with asymmetric information to capture the fact that the receiver does not initially know of the existence of fake data and learns about it during the course of the game. We use point processes to model the data traffic, where each piece of data can occur at any discrete moment in a continuous time flow. We fully solve the model and employ numerical examples to illustrate the players' strategies and payoffs for insights. In particular, our results show that maintaining some suspicion about the data sources can be very helpful to the data receiver.
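To make the "maintaining suspicion" intuition concrete, here is a minimal sketch that is not taken from the paper: honest and fake sources are assumed to emit data as Poisson processes with different rates, and the receiver performs a simple Bayesian update of its suspicion after observing the traffic; the rates, prior, and horizon are invented.

```python
# Illustrative sketch only (not the paper's equilibrium model): the receiver
# updates its belief that a data source is fake, assuming honest and fake
# sources emit data as Poisson processes with different (invented) rates.
import math
import numpy as np

rng = np.random.default_rng(1)

lam_honest, lam_fake = 2.0, 5.0    # hypothetical arrival rates (items per unit time)
prior_fake = 0.1                   # the receiver's initial suspicion
T = 3.0                            # observation window

truth_is_fake = True
lam_true = lam_fake if truth_is_fake else lam_honest
n_events = int(rng.poisson(lam_true * T))   # data arrivals observed in [0, T]

def poisson_pmf(n, lam):
    """Probability of observing n arrivals in [0, T] at rate lam."""
    return math.exp(-lam * T) * (lam * T) ** n / math.factorial(n)

like_fake = poisson_pmf(n_events, lam_fake)
like_honest = poisson_pmf(n_events, lam_honest)
posterior_fake = prior_fake * like_fake / (
    prior_fake * like_fake + (1 - prior_fake) * like_honest
)
print(f"arrivals observed: {n_events}, posterior P(source is fake) = {posterior_fake:.3f}")
```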

 

A New Class of Time Dependent Latent Factor Models with Applications
Journal of Machine Learning Research, 21(26-47), 1-24

In many applications, observed data are influenced by some combination of latent causes. For example, suppose sensors are placed inside a building to record responses such as temperature, humidity, power consumption and noise levels. These random, observed responses are typically affected by many unobserved, latent factors (or features) within the building such as the number of individuals, the turning on and off of electrical devices, power surges, etc. These latent factors are usually present for a contiguous period of time before disappearing; further, multiple factors could be present at a time. This paper develops new probabilistic methodology and inference methods for random object generation influenced by latent features exhibiting temporal persistence. Every datum is associated with subsets of a potentially infinite number of hidden, persistent features that account for temporal dynamics in an observation. The ensuing class of dynamic models constructed by adapting the Indian Buffet Process --- a probability measure on the space of random, unbounded binary matrices --- finds use in a variety of applications arising in operations, signal processing, biomedicine, marketing, image analysis, etc. Illustrations using synthetic and real data are provided.
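For readers unfamiliar with the Indian Buffet Process, the sketch below samples a binary feature-assignment matrix from the standard (static) IBP prior; it is only the building block mentioned above and does not reproduce the paper's dynamic extension with temporally persistent features.

```python
# A minimal sampler for the standard Indian Buffet Process (the static prior
# this work builds on); the dynamic, temporally persistent extension developed
# in the paper is not reproduced here.
import numpy as np

def sample_ibp(n_customers, alpha, rng=None):
    """Return a binary matrix Z with Z[i, k] = 1 if datum i exhibits latent feature k."""
    if rng is None:
        rng = np.random.default_rng()
    dish_counts = []                                   # popularity of each existing feature
    rows = []
    for i in range(1, n_customers + 1):
        # Existing features are taken with probability (popularity / i).
        row = [int(rng.uniform() < m / i) for m in dish_counts]
        # Brand-new features: Poisson(alpha / i) of them.
        n_new = rng.poisson(alpha / i)
        row.extend([1] * n_new)
        dish_counts = [m + z for m, z in zip(dish_counts, row)] + [1] * n_new
        rows.append(row)
    Z = np.zeros((n_customers, len(dish_counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

print(sample_ibp(n_customers=8, alpha=2.0, rng=np.random.default_rng(2)))
```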

 

Advanced Technology and End-Time in Organizations: A Doomsday for Collaborative Creativity?
Sirkka L. Jarvenpaa and Liisa Välikangas; Academy of Management Perspectives, 34(4) 566-584

Our capacity to tackle grand challenges facing humanity depends on collaborative creativity. Increasingly, such collaborative creativity is affected by advanced technology such as mobile technology, virtual communications, and algorithmic computing. We use a temporal lens to study the potential of advanced technology to influence collaborative creativity. Prior studies have found that inner time and social time are critical for collaborative creativity. To creatively and purposefully contribute to collaboration, inner time—a temporal capacity to reflect on actions, meaning, and consequences over time—is required. Also necessary is social time—the time spent with others—to practice giving and taking of multivocal ideas and perspectives. What has not been well scrutinized in the organization and management literature is whether advanced technology might suppress both inner time and social time. In this paper, we advance future-oriented conjectures about the potential impact of advanced technology on this temporal capacity. Included in our projections is a futuristic doomsday in which advanced technology has extinguished inner time and social time and hence curtailed collaborative creativity. We advance policy considerations for avoiding such an “end-time” scenario in organizations and societies.

 

All eyes on you: The social audience and hedonic adaptation
Sunaina K. Chugani; Psychology and Marketing, 37(11) 1554-1570

Marketers have a keen interest in keeping customers happy past the point of product acquisition. However, consumer happiness with products typically declines over time, a process called “hedonic adaptation.” Understanding this process is essential for managing consumers' post‐acquisition experiences, and yet marketers have not explored how the ubiquitous social environment influences hedonic adaptation. We explore the effect of a social audience (i.e., the presence of others and the perception that those others are noticing you) on adaptation to positive products using two real‐world studies and one lab study. We show that a social audience can slow hedonic adaptation by cuing consumers to believe that others are admiring their product. This perceived admiration, in turn, helps consumers see the product through fresh, unadapted eyes. These findings help clarify the role of the consumption environment in adaptation, help explain why product happiness can vary by consumer over time, and show that the effects of social forces do not always occur at the moment of product acquisition.

 

Can You Be Too Well Connected?
Ethan R. Burris, Dawn Klinghoffer, Elizabeth McCune, and Tannaz Sattari Tabrizi; Harvard Business Review Digital Articles

The benefits and pitfalls of networking have never been quantified. A team of researchers examined the meeting schedules and emails of Microsoft employees to find the employees with the most and the least connectedness. Well-connected employees were more engaged and more likely to speak up about issues at work. But the researchers were surprised to also find several downsides of being well-connected. Well-connected employees were less likely to engage in actions that would upset their hard-earned relationships. Furthermore, they were 16% less likely to be satisfied with their work-life balance and 20% less likely to think that their workload allowed them to achieve an acceptable work-life balance. The authors suggest several steps companies can take to guard work-life balance and encourage even well-connected employees to blow the whistle on problems that they see in their company.

 

A Project-Level Analysis of Value Creation in Firms
Financial Management, 49(2), 423-446

This paper analyzes value creation in firms at the project level. We present evidence that managers facing short-termist incentives set a lower threshold for accepting projects. Using novel data on new client and product announcements in both the U.S. and international markets, we find that the market responds less positively to a new project announcement when the firm’s managers have incentives to focus on short-term stock price performance. Furthermore, textual analysis of project announcements shows that firms with short-termist CEOs use more vague and generically positive language when introducing new projects to the marketplace.

 

aPRIDIT Unsupervised Classification with Asymmetric Valuation of Variable Discriminatory Worth
Linda L. Golden, Patrick L. Brockett, Montserrat Guillen, and Danae Manika; Multivariate Behavioral Research, 55(5) 685-703

Sometimes one needs to classify individuals into groups, but there is no available grouping information due to social desirability bias in reporting behaviors such as unethical or dishonest intentions or unlawful actions. Assessing hard-to-detect behaviors is useful; however, it is methodologically difficult because people are unlikely to self-disclose bad actions. This paper presents an unsupervised classification methodology utilizing ordinal categorical predictor variables. It allows for classification, individual respondent ranking, and grouping without access to a dependent group-indicator variable. The methodology also measures predictor variable worth (for determining target behavior group membership) at a category-by-category level, so different variable response categories can contain different amounts of information about classification. It is asymmetric in that a “0” on a binary predictor does not signal “membership in the target group” with the same strength that a “1” signals “membership in the non-target group.” The methodology is illustrated by identifying Spanish consumers filing fraudulent insurance claims. A second illustration classifies Portuguese high school students’ propensity to alcohol abuse. Results show the methodology is useful when dependent variable information is difficult to obtain and for deciding which predictor variables and categorical response options are most important.
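The following sketch illustrates, on simulated ordinal data, the PRIDIT-style scoring this method builds on: predictor categories are RIDIT-scored, and the first principal component yields variable weights and respondent scores without any group label. The asymmetric category-level valuation that distinguishes aPRIDIT is not reproduced here, and the data are invented.

```python
# Simplified PRIDIT-style sketch on simulated data (the asymmetric category-level
# valuation that defines aPRIDIT is not reproduced): RIDIT-score each ordinal
# predictor, then use the first principal component to weight variables and
# rank respondents without any group label.
import numpy as np

rng = np.random.default_rng(3)
X = rng.integers(0, 3, size=(200, 4))    # hypothetical: 200 respondents, 4 ordinal items (0..2)

def ridit_scores(col):
    """Map each ordinal category to P(lower category) - P(higher category)."""
    cats, counts = np.unique(col, return_counts=True)
    p = counts / counts.sum()
    below = np.concatenate(([0.0], np.cumsum(p)[:-1]))
    above = 1.0 - np.cumsum(p)
    score_by_cat = dict(zip(cats, below - above))
    return np.array([score_by_cat[c] for c in col])

F = np.column_stack([ridit_scores(X[:, j]) for j in range(X.shape[1])])
F = F - F.mean(axis=0)

# First principal component: variable weights (discriminatory worth) and a
# one-dimensional respondent score; which end of the score corresponds to the
# target group is for the analyst to decide (the sign is arbitrary).
_, _, Vt = np.linalg.svd(F, full_matrices=False)
weights = Vt[0]
respondent_score = F @ weights
print("variable weights:", np.round(weights, 3))
print("five lowest-scoring respondents:", np.argsort(respondent_score)[:5])
```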

 

Are Online Reviews of Physicians Reliable Indicators of Clinical Outcomes? A Focus on Chronic Disease Management
Danish H. Saifee, Zhiqiang (Eric) Zheng, Indranil R. Bardhan, and Atanu Lahiri; Information Systems Research, 31(4) 1282-1300

Current trends in patient empowerment indicate that patients who play an active role in managing their health also seek and use information obtained from online reviews of physicians. However, it is far from certain whether patient-generated online reviews accurately reflect the quality of care provided by physicians, especially in the context of chronic disease care. Because chronic diseases require continuous care, monitoring, and multiple treatments over extended time periods, it can be quite hard for patients to assess the effectiveness of a particular physician accurately. Given this credence nature of chronic disease care, the research question is the following: what is the information value associated with online reviews of physicians who treat chronic disease patients? We address this issue by examining the link between online reviews of physicians and their patients’ actual clinical outcomes based on a granular admission–discharge data set. Contrary to popular belief, our study finds that there is no clear relationship between online reviews of physicians and their patients’ clinical outcomes, such as readmission risk or emergency room visits. Our findings have two major implications: (a) online reviews may not be helpful in the context of healthcare services with credence aspects; (b) because treatments of chronic diseases have more credence good characteristics when compared with surgeries or other acute care services, one should not extrapolate research on surgeries and acute care services to chronic disease care. Rather, one should acquire a better understanding of the information conveyed in online reviews regarding a physician’s ability to deliver certain clinical outcomes before drawing inferences. Our findings have important ramifications for all stakeholders including hospitals, physicians, patients, payers, and policymakers.

 

Branching and Anchoring: Complementary Asset Configurations in Conditions of Knightian Uncertainty
Curba Morris Lampert, Minyoung Kim, and Francisco Polidoro; Academy of Management Review, 45(4) 847-868

The role of complementary assets across the different stages of a firm’s value chain in facilitating value creation and value appropriation from technological innovation remains a key area of interest in strategy and entrepreneurship research. However, current thinking on complementary assets operates with an unstated boundary condition—that relevant assets and asset configurations are relatively well known to the innovating firm. This assumption is applicable under conditions of “risk,” wherein decision-makers can know outcomes and probabilities. It is less clear, though, how current insights apply under conditions of “Knightian uncertainty,” in which neither outcomes nor probabilities are knowable. The purpose of this paper is to advance a complementary assets theory that accounts for conditions of Knightian uncertainty, thus aligning theory with the contemporary realities surrounding innovating firms. This article highlights an important intertemporal trade-off that existing literature ignores—without accounting for Knightian uncertainty, firms may unknowingly direct complementary assets in ways that favor current value appropriation at the expense of future value creation. We discuss theoretical implications for research on the microfoundations of dynamic capabilities and opportunities for future research on complementary asset configurations across geographic boundaries and across organizations in innovation ecosystems.

 

Characterizing the Hedging Policies of Commodity Price-Sensitive Corporations
Raphael H. Boroumand, Stephane Goutte, and Ehud I. Ronn; Journal of Futures Markets, 40(8) 1264-1281

Many corporations in the developed world face price and quantity uncertainty in commodities for which there are traded assets -- futures and options contracts -- which permit these corporations to hedge the risk to which they are exposed. Finance research has demonstrated that frictions in capital markets are equivalent to risk-averse decision-making; accordingly, decision-makers may make optimal decisions based on a trade-off between risk and return.

Theory indicates that the optimal hedging program hinges on the amount of exposure the corporation wishes to hedge: as risk aversion increases, the company's optimal hedge proceeds from no hedging, to acquiring options, to replacing options with futures contracts. Using data from the CFTC as well as gold companies, this paper provides an empirical test of whether corporations' hedge ratios are consistent with such optimal management of risk exposure using futures and/or options.
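As a simple reference point for the hedge-ratio comparison, the snippet below computes the textbook minimum-variance futures hedge ratio on invented price-change data; the paper's risk-aversion-dependent progression from no hedging to options to futures is richer than this single statistic.

```python
# Textbook minimum-variance futures hedge ratio on invented data, as a simple
# reference point (the paper's risk-aversion-dependent hedging program is richer).
import numpy as np

rng = np.random.default_rng(4)
futures_chg = rng.normal(0.0, 2.0, size=120)                     # hypothetical monthly futures price changes
spot_chg = 0.9 * futures_chg + rng.normal(0.0, 0.8, size=120)    # imperfectly correlated spot changes

h_star = np.cov(spot_chg, futures_chg)[0, 1] / np.var(futures_chg, ddof=1)
print(f"minimum-variance hedge ratio: {h_star:.2f}")
```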

 

Collusion in Markets with Syndication
John W. Hatfield, Scott Duke Kominers, Richard Lowery, and Jordan M. Barry; Journal of Political Economy, 128(10) 3779-3819

Markets for IPOs and debt issuances are syndicated, in the sense that a bidder who wins a contract may invite losing bidders to join a syndicate that together fulfills the contract. We show that in markets with syndication, standard intuitions from industrial organization can be reversed: Collusion may become easier as market concentration falls, and market entry may in fact facilitate collusion. In particular, price collusion can be sustained by a strategy in which firms refuse to join the syndicate of any firm that deviates from the collusive price. Our results thus can rationalize the apparently contradictory empirical facts that the market for IPO underwriting exhibits seemingly collusive pricing despite its low level of market concentration.

 

Communicating Assurance Using Practitioner-Customized Procedures: An Experiment and Emerging Research Opportunities
Sandra C. Vera-Munoz, Lisa Milici Gaynor, and William R. Kinney; Auditing: A Journal of Practice & Theory, 39(4) 201-222

Traditionally, financial and nonfinancial information assurance standards have specified either “high” assurance based on “sufficient evidence” or “moderate” assurance based on analytical procedures and inquiries. Recently, in response to rapidly growing nonfinancial assurance demand, the IAASB extended the possible range of limited assurance by allowing practitioner-customized procedure descriptions and assuming diverse users can “appreciate” the varying reliability achieved. To test the validity of this policy change, we examine report users' confidence judgments for a GHG emissions assurance report using combinations of report attributes: critical practitioner-customized procedure descriptions, conclusion frame, and engagement label. We find that, consistent with an “assurance communication gap,” including or explicitly excluding a practitioner-customized procedure deemed essential for reasonable assurance does not significantly affect users' confidence judgments. However, we find that both positive conclusion frames and reasonable assurance engagement labels incrementally enhance confidence judgments. We outline research and practice opportunities related to emerging policy prescriptions involving practitioner-customized procedures.

 

Corporate Strategy Changes and Information Technology Control Effectiveness in Multibusiness Firms
Huseyin Tanriverdi and Kui Du; MIS Quarterly, 44(4) 1573-1617

 

Does Public Country-by-Country Reporting Deter Tax Avoidance and Income Shifting? Evidence from the European Banking Industry
Terry Shevlin and Aruhn Venkat; Contemporary Accounting Research

In this study, we examine the effect of increased tax transparency on the tax planning behavior of European banks. In 2014, the European Union introduced public country-by-country reporting requirements to the banking industry. Treating this new requirement as an exogenous shock, we find limited evidence consistent with a decline in income-shifting by the banks’ financial affiliates in the post-adoption period (starting from 2015). We do not, however, find robust evidence of a significant change in the consolidated book effective tax rates among the affected banks. Our findings suggest that increased transparency from public country-by-country reporting can deter tax-motivated income shifting but that it did not appear to materially influence a bank’s overall tax avoidance. Our findings have policy implications for the ongoing debate between the European Parliament, the Organisation for Economic Co-operation and Development, and accounting standard-setting bodies on whether to require multinationals to publish country-by-country reports.

 

How Reliably Do Empirical Tests Identify Tax Avoidance?
Lisa De Simone, Jordan Nickerson, Jeri Seidman, and Bridget Stomberg; Contemporary Accounting Research, 37(3) 1536-1561

Research on the determinants of tax avoidance has relied on tests using GAAP and cash effective tax rates (ETRs) and total and permanent book-tax differences. Two new proxies have emerged that overcome documented limitations of these proxies: one, developed by Henry and Sansing (2018), allows for more meaningful interpretation of results estimated in samples that include loss observations. The other, reserves for unrecognized tax benefits (UTB), provides new data on tax uncertainty. We offer empirical evidence on how well tests using these new proxies perform relative to those extensively used in prior research. We find that tests using the proxy developed by Henry and Sansing (2018) have lower power relative to those using other proxies across all samples, including a sample that includes loss observations. In contrast, when firms accrue reserves for uncertain tax avoidance, tests using the current-year addition to the UTB have the highest power across all proxies, samples, and levels of reserves. In the absence of reserves, tests using the GAAP ETR best detect uncertain tax avoidance, on average. This study contributes to the literature by using a controlled environment to provide the first large-scale empirical evidence on how the power of tests varies with the use of relatively new proxies, the inclusion of loss observations, and the advent of FIN 48.
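For readers unfamiliar with the rate-based proxies, a quick worked example with hypothetical figures shows how the GAAP and cash ETRs are computed and why loss observations complicate their interpretation.

```python
# The two most common rate-based proxies, computed on hypothetical figures:
# the GAAP ETR scales total tax expense, and the cash ETR scales cash taxes
# actually paid, by pretax book income.
pretax_income = 1000.0        # hypothetical pretax book income ($ millions)
total_tax_expense = 180.0     # current + deferred tax expense ($ millions)
cash_taxes_paid = 120.0       # cash taxes paid ($ millions)

gaap_etr = total_tax_expense / pretax_income    # 0.18
cash_etr = cash_taxes_paid / pretax_income      # 0.12
print(f"GAAP ETR = {gaap_etr:.2f}, cash ETR = {cash_etr:.2f}")
# When pretax_income is near zero or negative (loss firms), these ratios become
# hard to interpret, which motivates the Henry and Sansing (2018) alternative.
```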

 

Imperfect Quality Certification in Lemons Markets
Birendra K. Mishra, Ashutosh Prasad, and Vijay Mahajan; Theoretical Economics Letters, 10(6) 1260-1275

In markets with information asymmetry, the seller of a high-quality product is unable to credibly communicate its quality to buyers and is forced to price like an average-quality seller. This is a disincentive to provide quality, and high-quality sellers may exit the market. Of several methods to reduce information asymmetry, we provide an analytical study of certification or grading of quality levels by infomediaries. In the equilibrium of a quality reporting game, we find that certification reduces, but does not eliminate, the problems of information asymmetry. There exists a threshold, determined by the accuracy of the certification process, such that customers should believe quality reports below it but disbelieve reports above it. We further examine a two-category scheme of high/low quality certification and discuss the design of certification grades using an entropy approach.
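A back-of-the-envelope Bayes calculation (with invented numbers, not the paper's equilibrium analysis) shows why the certifier's accuracy governs how much a quality report should move buyers' beliefs.

```python
# Back-of-the-envelope Bayes check of why certifier accuracy drives report
# credibility; the prior and accuracy below are invented for illustration.
prior_high = 0.3    # hypothetical share of high-quality sellers
accuracy = 0.8      # probability the certifier grades quality correctly

p_certified_high = accuracy * prior_high + (1 - accuracy) * (1 - prior_high)
posterior_high = accuracy * prior_high / p_certified_high
print(f"P(high quality | certified high) = {posterior_high:.2f}")   # about 0.63
# As accuracy falls toward 0.5 the report carries little information, so how far
# buyers should trust a given grade hinges on the certification technology.
```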

 

Last-Mile Shared Delivery: A Discrete Sequential Packing Approach
Junyu Cao, Mariana Olvera-Cravioto, and Zuo-Jun Shen; Mathematics of Operations Research, 45(4) 1466-1497

We propose a model for optimizing the last-mile delivery of n packages from a distribution center to their final recipients, using a strategy that combines the use of ride-sharing platforms (e.g., Uber or Lyft) with traditional in-house van delivery systems. The main objective is to compute the optimal reward offered to private drivers for each of the n packages such that the total expected cost of delivering all packages is minimized. Our technical approach is based on the formulation of a discrete sequential packing problem, in which bundles of packages are picked up from the warehouse at random times during the interval [0,T]. Our theoretical results include both exact and asymptotic (as n -> infinity) expressions for the expected number of packages that are picked up by time T. They are closely related to the classical Rényi’s parking/packing problem. Our proposed framework is scalable with the number of packages.
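For context on the classical benchmark the asymptotic results relate to, the snippet below simulates Rényi's parking/packing problem using its standard recursive decomposition; it illustrates that classical problem only, not the paper's delivery model.

```python
# Simulation of the classical Rényi parking/packing problem referenced above,
# via its standard recursive decomposition: the first unit-length "car" parks
# uniformly at random, and the two remaining gaps fill independently.
import numpy as np

def renyi_count(length, rng):
    """Number of unit cars in a jammed configuration on an interval of the given length."""
    if length < 1.0:
        return 0
    x = rng.uniform(0.0, length - 1.0)   # left endpoint of the first parked car
    return 1 + renyi_count(x, rng) + renyi_count(length - x - 1.0, rng)

rng = np.random.default_rng(5)
densities = [renyi_count(500.0, rng) / 500.0 for _ in range(5)]
print("simulated packing densities:", np.round(densities, 3))   # near Rényi's constant, about 0.7476
```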

 

Matching Mobile Applications for Cross Promotion
Gene Moo Lee, Shu He, Joowon Lee, and Andrew B. Whinston; Information Systems Research, 31(3) 865-891

The mobile app market is one of the most successful software markets. As the platform grows rapidly, with millions of apps and billions of users, search costs are increasing tremendously. The challenge is how app developers can target the right users with their apps and how consumers can find apps that fit their needs. Cross promotion, advertising a mobile app (target app) in another app (source app), is introduced as a new app promotion framework to alleviate the issue of search costs. In this paper, we model source app user behaviors (downloads and post-download usage) with respect to different target apps in cross-promotion campaigns. We construct a novel app similarity measure using LDA topic modeling on apps’ product descriptions, and then analyze how the similarity between the source and target apps influences users’ app download and usage decisions. To estimate the model, we use a unique data set from a large-scale random matching experiment conducted by a major mobile advertising company in Korea. The empirical results show that consumers prefer more diversified apps when they are making download decisions compared with their usage decisions, which is supported by the psychology literature on people’s variety-seeking behavior. Lastly, we propose an app-matching system based on machine learning models (on app download and usage prediction) and generalized deferred acceptance algorithms. The simulation results show that app analytics capability is essential in building accurate prediction models and in increasing ad effectiveness of cross-promotion campaigns, and that, at the expense of privacy, individual user data can further improve the matching performance. The paper has implications for the tradeoff between utility and privacy in the growing mobile economy.
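The sketch below illustrates, on invented numbers, the two ingredients the framework combines: an app-similarity score computed as cosine similarity between LDA topic distributions, and a standard deferred acceptance (Gale-Shapley) matching driven by those scores. It is a toy version, not the paper's estimated matching system.

```python
# Toy sketch, on invented numbers, of the two ingredients combined above:
# (1) app similarity as cosine similarity between LDA topic distributions, and
# (2) a standard deferred acceptance (Gale-Shapley) matching of target apps to
# source-app ad slots driven by those similarity scores.
import numpy as np

# Hypothetical LDA topic distributions (rows sum to 1) for 3 source and 3 target apps.
source_topics = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]])
target_topics = np.array([[0.6, 0.3, 0.1], [0.2, 0.2, 0.6], [0.1, 0.7, 0.2]])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

score = np.array([[cosine(s, t) for t in target_topics] for s in source_topics])

def deferred_acceptance(score):
    """Targets propose to sources in order of score; each source holds its best proposer."""
    n_sources, n_targets = score.shape
    prefs = [list(np.argsort(-score[:, t])) for t in range(n_targets)]  # each target's source ranking
    held = {}                                                           # source -> target currently held
    free_targets = list(range(n_targets))
    while free_targets:
        t = free_targets.pop(0)
        s = prefs[t].pop(0)                                             # target t's next-best source
        if s not in held or score[s, t] > score[s, held[s]]:
            if s in held:
                free_targets.append(held[s])                            # displaced target proposes again
            held[s] = t
        else:
            free_targets.append(t)                                      # rejected; try the next source
    return held

print("source -> matched target:", deferred_acceptance(score))
```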

 

Modeling stochastic mortality for joint lives through subordinators
Yuxin Zhang and Patrick L. Brockett; Insurance: Mathematics & Economics, 95, 166-172

There is a burgeoning literature on mortality models for joint lives. In this paper, we propose a new model in which we use time-changed Brownian motion with dependent subordinators to describe the mortality of joint lives. We then employ this model to estimate the mortality rate of joint lives in a well-known Canadian insurance data set. Specifically, we first depict an individual’s death time as the stopping time when the value of the hazard rate process first reaches or exceeds an exponential random variable, and then introduce the dependence through dependent subordinators. Compared with existing mortality models, this model better captures the correlation of deaths between joint lives, and allows more flexibility in the evolution of the hazard rate process. Empirical results show that this model yields highly accurate estimates of mortality compared to the baseline non-parametric (Dabrowska) estimation.
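As a heavily simplified, discretized stand-in for the construction described above (the time-changed Brownian motion and dependent subordinators are not reproduced), the sketch below draws a death time as the first time the cumulative hazard reaches an independent Exp(1) threshold, with dependence between the two lives induced by a shared random component in their hazard paths; all parameters are invented.

```python
# Heavily simplified, discretized stand-in for the construction described above
# (the time-changed Brownian motion and dependent subordinators are not
# reproduced): death occurs when the cumulative hazard first reaches an
# independent Exp(1) threshold, and dependence between the two lives comes
# from a shared random component in their hazard paths. All parameters invented.
import numpy as np

rng = np.random.default_rng(6)
dt, horizon = 0.1, 60.0                       # years
t = np.arange(0.0, horizon, dt)

def death_time(shared_shocks):
    base = 0.002 * np.exp(0.09 * t)           # hypothetical Gompertz-like baseline hazard
    hazard = base * np.exp(shared_shocks)     # common shocks induce dependence between lives
    cum_hazard = np.cumsum(hazard * dt)
    threshold = rng.exponential(1.0)          # independent Exp(1) threshold for this life
    hit = np.argmax(cum_hazard >= threshold)
    return t[hit] if cum_hazard[-1] >= threshold else np.inf

shared = np.cumsum(rng.normal(0.0, 0.05, size=t.size))    # shared random environment
couple = (death_time(shared), death_time(shared))          # joint lives share the same path
print("simulated death times (years):", np.round(couple, 1))
```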