2020 hotel tech issues: The perils of automation

by BRUNA PEDRINI

The intensity and urgency to deliver the best engagement and most personalized experience have changed the landscape of the hospitality industry. Much of the industry is catapulting toward the “smart hotel” model, allowing travelers to control their hotel experience from a personal device: booking, digital check-in, control of guestroom smart devices, concierge chat assistance, and digital check-out, all from a single, unified, user-friendly app. Unfortunately, this is often done without considering the legal and human impact.

In a business focused on enhanced experiences, the lure of automation is very real. Prospective guests shop competing websites for the best deal, vendors promise the latest and greatest experience through their products and technologies, and people learn of employment opportunities and apply through websites and social media. Meanwhile, online reviews can build or kill a business, and regulators and plaintiffs search the internet for compliance and liability issues from thousands of miles away. This article calls for people in the hospitality industry to consider how to use technology without being trampled by unintended consequences. The following list is a starting point for the areas to evaluate; it is by no means exhaustive.

ALGORITHMIC BIAS
First things first: Automation’s strength is also its weakness. Algorithms are processes or sets of rules involving mathematical calculations that turn data into more data to help people make decisions. Algorithmic bias, also called machine learning bias or AI bias, is a systematic error in the coding, collection, or selection of data that produces unintended or unanticipated discriminatory results. Algorithmic bias is perpetuated by data scientists who train algorithms on patterns found in historical data. With algorithms, the saying “garbage in, garbage out” is more real than ever. Although results produced by an algorithm may have the patina of objectivity, the real or implicit biases of the coders are baked into the product.

This means the recruiting tool a hotel uses to gather résumés, sort applicants, and assure non-discriminatory employment practices may not rate candidates neutrally when it comes to gender, race, disability, or religion. There need be no intent to discriminate by the business, yet the end result may be discriminatory. Amazon recently walked away from its recruiting tool because the AI models it used were trained on résumé data from the prior 10 years, which, reflecting historical hiring patterns, came primarily from white men. As a result, the model learned that male candidates were preferable, unwittingly perpetuating historical bias.
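To see how this happens mechanically, consider the toy sketch below. It is a hypothetical illustration of the general mechanism, not Amazon’s system: a naive scorer “trained” on historical hiring outcomes ends up rewarding a gender-correlated résumé token even though gender itself never appears in the data.

```python
# Illustrative sketch of how a screening model trained on historical hiring
# outcomes can reproduce past bias. The "résumés" and outcomes below are
# hypothetical toy data; the point is the mechanism, not any real tool.

from collections import defaultdict

# Historical training data: (tokens appearing on the résumé, was_hired)
history = [
    ({"python", "captain_mens_chess_club"}, True),
    ({"python", "captain_mens_chess_club"}, True),
    ({"python", "womens_coding_society"}, False),
    ({"java", "womens_coding_society"}, False),
    ({"java", "captain_mens_chess_club"}, True),
]

# "Train": score each token by the hire rate of past résumés containing it.
seen, hired = defaultdict(int), defaultdict(int)
for tokens, was_hired in history:
    for t in tokens:
        seen[t] += 1
        hired[t] += was_hired

token_score = {t: hired[t] / seen[t] for t in seen}

def score(resume_tokens):
    # Average the learned token scores; unseen tokens get a neutral 0.5.
    return sum(token_score.get(t, 0.5) for t in resume_tokens) / len(resume_tokens)

# Two equally qualified candidates differ only in a gender-correlated token.
print(score({"python", "captain_mens_chess_club"}))  # ~0.83: favored
print(score({"python", "womens_coding_society"}))    # ~0.33: penalized
```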

Algorithmic bias arises from the thousands of implicit and intentional biases of the people collecting data, coding, and developing programs and products. The good news is that there are solutions. On a basic level, recognizing that everyone brings implicit biases to decision-making frees the creators of algorithms to identify, test, and limit the impact of those biases; once a person is aware of a bias, they can act to counter and neutralize its negative impact. As consumers of the end product, members of the hospitality industry can use their purchasing power to ask questions about a product and how it was developed. Contracts with vendors can include clauses requiring the seller to maintain a diverse workforce and requiring that apps and products be built to be accessible and compliant with federal and state standards. Most importantly, products can be evaluated in practice. Ultimately, it is the hotel owner’s brand and business on the line.
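One way to evaluate a screening or recruiting product in practice is to compare its outcomes across groups. The Python sketch below is purely illustrative, with hypothetical data and group labels; it applies the four-fifths (80%) rule of thumb used in adverse-impact analysis, which is a screening heuristic, not a substitute for legal review.

```python
# Illustrative sketch: a basic adverse-impact check on a screening tool's output.
# The data, group labels, and threshold below are hypothetical; the four-fifths
# (80%) rule is a common rule of thumb, not a complete legal analysis.

from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, selected) tuples, where selected is True/False."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Flag any group whose selection rate is below 80% of the highest rate."""
    best = max(rates.values())
    return {g: (rate / best >= 0.8) for g, rate in rates.items()}

# Hypothetical output from a résumé-screening tool.
decisions = [("group_a", True)] * 45 + [("group_a", False)] * 55 \
          + [("group_b", True)] * 25 + [("group_b", False)] * 75

rates = selection_rates(decisions)
print(rates)                     # {'group_a': 0.45, 'group_b': 0.25}
print(four_fifths_check(rates))  # {'group_a': True, 'group_b': False} -> possible adverse impact
```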

FACIAL RECOGNITION AND FACIAL ANALYSIS
Facial recognition technology uses an automated process to analyze faces captured in images and video to identify or confirm the identity of individuals. Facial analysis and facial recognition differ in their underlying technology and the data used to train them, but the issues they raise for commercial entities are similar and numerous. Both types of systems, for example, tend not to identify all races and national origins equally. Studies have shown that facial recognition systems have a harder time identifying women and darker-skinned people, leading to false positives. Other systems have difficulty “seeing” persons of color, and particular difficulty identifying the gender of dark-skinned women: in one study, the gender of 35% of dark-skinned women was misidentified, compared with 1% of light-skinned men. Similarly, facial recognition systems do not perform well for persons with certain disabilities.

Hotels already use facial recognition for check-in, entering rooms and entertainment facilities, and cashless payments. Legally, this means that facial recognition and facial analysis should not be standalone solutions; companies need to examine their processes to assure inclusion. Hotels that use facial recognition for financial transactions or to verify identity or age should have live back-up support onsite and readily available. Fun as the YouTube video “How to Pour Beer with your Face” sounds, a thoughtful, inclusive approach will preserve the fun without the litigation. Last, but not least, facial recognition used by private industry must be considered in light of data privacy requirements.
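One practical step toward assuring inclusion is to measure a vendor’s system against a representative, labeled evaluation set before going live. The sketch below uses hypothetical records and group labels to show how per-group error rates could be compared; real evaluation data would come from running the vendor’s system against a labeled test set.

```python
# Illustrative sketch: comparing a facial verification system's error rates
# across demographic groups. The records below are hypothetical; in practice
# they would come from a labeled evaluation set run through the vendor's system.

from collections import defaultdict

def error_rates_by_group(records):
    """records: list of (group, is_genuine_match, system_accepted) tuples."""
    counts = defaultdict(lambda: {"genuine": 0, "fnm": 0, "impostor": 0, "fm": 0})
    for group, genuine, accepted in records:
        c = counts[group]
        if genuine:
            c["genuine"] += 1
            if not accepted:
                c["fnm"] += 1      # genuine guest wrongly rejected
        else:
            c["impostor"] += 1
            if accepted:
                c["fm"] += 1       # wrong person wrongly accepted
    return {
        g: {
            "false_non_match_rate": c["fnm"] / c["genuine"] if c["genuine"] else None,
            "false_match_rate": c["fm"] / c["impostor"] if c["impostor"] else None,
        }
        for g, c in counts.items()
    }

# Hypothetical evaluation records: (group, is_genuine_match, system_accepted)
records = (
    [("group_a", True, True)] * 98 + [("group_a", True, False)] * 2
  + [("group_b", True, True)] * 90 + [("group_b", True, False)] * 10
  + [("group_a", False, False)] * 50
  + [("group_b", False, False)] * 49 + [("group_b", False, True)] * 1
)
print(error_rates_by_group(records))
# A markedly higher false-non-match rate for one group is a signal to add
# live back-up support and to press the vendor on its training data and testing.
```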

TAILORED ADVERTISING AND MARKETING
Targeted marketing can raise civil rights concerns. Tailored advertising “should” not take protected-class status into account, yet the information collected on individual guests can often be correlated and sorted to detrimental effect. Hotels collect broad-ranging information: identification (name, address, gender, etc.), preferences (high floor, firm pillows, special needs), lodging history, and behavioral information such as dining preferences and spa appointments, with details like frequency, time, and amounts spent. The questions abound: If the marketing one guest receives is different from that received by another, what is the basis for the distinction? Is one person or group receiving “better” treatment? Are the offers based on that guest’s individual habits, or on an estimate derived from demographics that include protected-class data?
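One way to probe these questions is to check whether a supposedly neutral marketing segment effectively stands in for a protected class. The sketch below uses hypothetical guest records and group labels; if each segment is dominated by one group, offers that differ by segment will, in practice, differ by protected class.

```python
# Illustrative sketch: checking whether a "neutral" marketing segment acts as a
# proxy for a protected class. The guest records below are hypothetical.

from collections import Counter

# (segment assigned by the marketing tool, protected attribute)
guests = [("segment_1", "group_a")] * 80 + [("segment_1", "group_b")] * 20 \
       + [("segment_2", "group_a")] * 15 + [("segment_2", "group_b")] * 85

by_segment = Counter(guests)
for segment in ("segment_1", "segment_2"):
    total = sum(n for (s, _), n in by_segment.items() if s == segment)
    for group in ("group_a", "group_b"):
        share = by_segment[(segment, group)] / total
        print(f"{segment}: {group} share = {share:.0%}")
# If each segment is dominated by one protected group, offers that differ by
# segment will, in effect, differ by protected class.
```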

WEBSITES AND APPS
Americans with disabilities spend more than $17 billion each year on travel. Accessible hotel websites, apps, and services are a necessity for would-be travelers browsing the hotel website from home as well as for hotel visitors who browse the site on a smartphone or laptop. People with disabilities often travel with assistive devices, and hotel websites should support them. Hotel websites and apps should also be designed and maintained so that all guests can conduct the necessary travel planning, booking, and transactions online. Several large brands have developed lists that note whether entranceways, reception areas, and other important parts of the hotel are level and step-free; whether gyms, conference rooms, and business centers are accessible; and whether all hotel room and external bathrooms are fully accessible. Finally, hotels should have accessibility policies that detail the scope of their commitment to meeting the WCAG 2.0 AA standard.
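Automated checks cover only a small slice of WCAG 2.0 AA, but they are a cheap first pass. The sketch below, using only Python’s standard library and a hypothetical page snippet, flags images that lack a text alternative (success criterion 1.1.1); manual testing with assistive technology is still needed.

```python
# Illustrative sketch: a minimal automated check for one WCAG 2.0 criterion
# (text alternatives for images, success criterion 1.1.1). Automated checks
# like this catch only a fraction of accessibility issues.

from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            # Flag images with no alt attribute at all; decorative images
            # should still carry an explicit empty alt="".
            if "alt" not in attr_dict:
                self.flagged.append(attr_dict.get("src", "<unknown source>"))

# Hypothetical page snippet; in practice this would be fetched from the hotel site.
sample_html = """
<img src="/img/pool.jpg">
<img src="/img/logo.png" alt="Hotel Example logo">
"""

checker = MissingAltChecker()
checker.feed(sample_html)
print(checker.flagged)   # ['/img/pool.jpg']
```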

CONCLUSION
Whether the issue is the front desk or automated checkout, technology in guestrooms, accessibility, marketing, or privacy, technology use in the hospitality industry raises myriad legal issues to consider. The best advice: Think inclusively, analyze effects, and reach out to your legal counsel before you buy or commit to new technology.

Bruna Pedrini is an Of Counsel attorney with Fennemore Craig, P.C. Based in Phoenix, AZ, she specializes in the areas of accessibility, anti-discrimination, and education law, and she has experience representing builders, developers, colleges and universities, sports and concert stadiums and venues, as well as restaurants and hotels. She can be reached at [email protected] or 602-916-5487.
