At Intersection of Marginalized Groups and Tech, Legal Questions Abound
CHBA CLE shines light on gray area of racial language and gender issues of AI

by Avery Martinez

Many aspects of our daily lives are increasingly dominated by technology and data. While debates surge about equality and representation in government, some are raising concerns about the supposed neutrality and objectivity of technology.

In turn, this can create many legal questions surrounding equality and diversity at the intersection of data, technology and law. Moreover, those questions do not always come with clear paths to legal recourse.

“What we’ve seen a lot of times with technology is that there’s a disproportionate impact on marginalized and vulnerable populations,” said Marty Esquibel, a data protection professional and current HIPAA Privacy and Security Officer at the Colorado Department of Human Services. He noted that the opinions expressed throughout the presentation were his own.

On Dec. 17, Esquibel presented a continuing legal education program for the Colorado Hispanic Bar Association that introduced the intersection of race and technology. Technology's disproportionate effects can raise legal questions for marginalized groups broadly, not just Latinos. As both a lawyer and a data professional, Esquibel questioned common assumptions about the objectivity of data and technology, and in particular the sense of awe that technology tends to inspire.

One issue, especially in American society, is a widespread faith in technology and data, Esquibel said. Americans tend to think that delegating social problems to data or technology will solve them, on the assumption that these tools are objective, neutral and efficient. But that might not be the reality, because "technology's just a reflection of us," Esquibel said. He sees danger in placing faith in technology, and a particular concern is what gets built into the tools being used. Because humans are flawed, he said, those flaws can be reflected in software and data.

Esquibel said programming will do what it is told to do, not what someone intends it to do, and these mismatches between intention and programming can lead to bias. Bias can enter technology at three points, he said: in its development, in its data and in its deployment, and harmful bias can enter at any of them.


This complete article appears in the Dec. 21 issue of Law Week Colorado. To read other articles from that issue, order a copy online. Subscribers can request a digital PDF of the issue.
