A lawyer’s computer program offers to select potential jurors based on their ethnicity, political opinions, and profession in order to find the jury most favorable to a defense attorney’s case.
Momus Analytics, a company founded by attorney Alex Alvarez, crawls the social media accounts of potential jurors and uses the results to predict whether or not they should be chosen.
The program includes a race-based algorithm suggesting that Asians, Central Americans and South Americans are more likely to be leaders – a quality the program appears to favor. People who described their race as “other” were also rated as likely leaders.
Alvarez, who worked with Texas-based software designer Frogslayer to develop the program, has a patent pending for the program.
The system then assigns numerical scores to the potential juror in three categories: leadership, social responsibility and personal responsibility.
The program uses information such as a person’s occupation, political affiliation, education level and even ethnicity to determine the value of each category, according to a report published in Vice.
The system can also check for discrepancies between a person’s different social media accounts, which may indicate that the person is not reliable.
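Momus has not published its scoring model, so the following is a purely illustrative sketch of the kind of category scoring and cross-account consistency check described above. Every field name, weight, and lookup table here is invented for demonstration; none of it reflects Momus Analytics’ actual logic.

```python
# Illustrative sketch only. The article says the system scores jurors on
# leadership, social responsibility and personal responsibility using data
# such as occupation and education level, and flags inconsistencies across
# accounts. All weights and fields below are hypothetical.

def score_juror(profile: dict) -> dict:
    """Assign a toy numeric score per category from a juror profile."""
    # Hypothetical occupation weights; the real model's weights are unknown.
    occupation_weights = {"manager": 2, "teacher": 1, "student": 0}
    return {
        "leadership": occupation_weights.get(profile.get("occupation"), 0),
        "social_responsibility": 1 if profile.get("volunteers") else 0,
        "personal_responsibility": profile.get("education_level", 0),
    }

def flag_inconsistencies(accounts: list, field: str) -> bool:
    """Return True if a field's value differs across social media accounts."""
    values = {a[field] for a in accounts if field in a}
    return len(values) > 1
```

For example, a profile listing “manager” as occupation would receive the invented leadership weight of 2, and two accounts listing different occupations would be flagged as inconsistent.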
Alvarez says the idea first came to him after a jury ruled against one of his clients in what he believed was a simple personal injury lawsuit.
“If the reason I win or lose, or a lawyer wins or loses, is based on their skill level, then why did this happen?” he said in a short promotional video explaining the project.
“And that started me on a quest to find out why juries decide cases and how juries decide cases in America.”
Alex Alvarez founded Momus Analytics after a jury ruled against one of his clients in a personal injury case he deemed simple and straightforward.
Juries are selected through a process called voir dire, which gives prosecutors and defense lawyers the opportunity to assess jurors.
Alvarez believes that an additional analytical tool to help lawyers make better use of their limited time with jurors during voir dire could lead to better results.
“The voir dire system can be difficult given the very short time within which a lawyer must question jurors,” argues Momus’ patent application.
Lawyers are allowed to question potential jurors before a trial in a process known as voir dire, but they may only dismiss a limited number of jurors and often do not have enough time to research the history of potential jurors.
WHAT IS VOIR DIRE?
The voir dire is a process used to select jurors before a jury trial.
The term combines the French words for “to see” and “to speak”, and colloquially means “to speak the truth”.
Both prosecutors and defense attorneys are allowed to interview potential jurors to try to select the candidates they believe will be the most suitable.
Depending on the state, prosecutors and defense attorneys are allowed to automatically disqualify a set number of jurors for whatever reason they choose.
The number of times they are allowed to do this is usually limited to ensure that the process does not become lengthy.
“You are looking for people who need to be removed (from the jury pool) and your questions need to be designed to uncover who should be removed,” lawyer and jury expert Jeffery T. Frederick said of the process.
“Lawyers often have thirty minutes or less to conduct the voir dire process. Challenges are available to each party in order to disqualify a juror, with or without cause, according to the biases revealed by the juror… but whatever their number, they are generally limited.
“Accordingly, a method or system to quickly assess a juror’s background for possible bias, within the short time allowed for voir dire, would be beneficial to a lawyer seeking to select a favorable jury with a limited number of challenges available.”
Andrew Ferguson, a law professor at American University, warns that the firm’s approach may introduce its own set of unexamined biases, which could violate legal prohibitions on racial and gender discrimination in jury selection.
“The idea that the algorithm is going to weight race or gender or other protected classes in a way that could be critical to the outcome,” he told Vice, “that’s a problem, because then you sort of cleanse your racialized assumptions about people.”