Criminal Law

ABA on the ethical obligations of prosecutors in misdemeanor cases

Here is a recent Daily Record column. My past Daily Record articles can be accessed here.

*****

Today I came across a headline that I assumed at first glance was an Onion article or some other type of satire. It had to be. The ABA Journal headline stated something that I’ve always assumed was simply a given: “Prosecutors must maintain ethical conduct during misdemeanor plea deals, ABA ethics opinion says.”

Note that what the headline failed to mention was that the opinion addressed prosecutors’ obligations when interacting with unrepresented misdemeanor defendants. But even so - come on! How could there be any confusion on that issue? Prosecutors are lawyers and, just like the rest of us, they’re required to act ethically at all times. There’s no “misdemeanor or lower” exception to ethics requirements. At least, not that I’m aware of.

Nevertheless, the fact that the ABA felt the need to weigh in is an indication that further clarity was needed on this issue. And if the ABA deems this topic important enough to opine on, then I likewise believe that it’s important enough for me to write about. So if you’re as curious as I was about this opinion, then buckle up and let’s dive in.

In Formal Opinion 486, which was handed down on May 9th, 2019, the ABA Standing Committee on Ethics and Professional Responsibility considered the ethical obligations of prosecutors when negotiating plea bargains with unrepresented individuals accused of misdemeanors.

At the outset, the Committee acknowledged that while most prosecutors perform their job functions ethically, that’s not always the case: “Notwithstanding the commitment of most prosecutors to high professional standards, there is evidence that in misdemeanor cases where the accused is or may be legally entitled to counsel, methods of negotiating plea bargains have been used in some jurisdictions that are inconsistent with the duties set forth in the Rules of Professional Conduct.”

The Committee then turned to the accused’s right to counsel, noting that it is unethical for prosecutors to interfere with this right in any way: “Under Model Rule 3.8(b) prosecutors must make reasonable efforts to assure that unrepresented accused persons are informed of the right to counsel and the process for securing counsel, and must avoid conduct that interferes with that process.”

Next, the Committee tackled the plea bargaining process, explaining that when a defendant is unrepresented, prosecutors must discuss the known consequences of a proposed plea deal with the accused. This is because an unrepresented defendant is in a uniquely vulnerable position. As such, “if the prosecutor knows the consequences of a plea – either generic consequences or consequences that are particular to the accused – the prosecutor must disclose them during the plea negotiation.”

The Committee further elaborated on the obligations of prosecutors in this situation and provided examples of impermissible conduct:

“Thus, where a prosecutor knows from the charge selected, the accused’s record, or any other information that certain collateral consequences or sentence enhancements apply to a plea on that charge, statements like the following would constitute prohibited misrepresentations:

‘Take this plea for time served and you are done, you can go home now.’

‘This is a suspended sentence, so as long as you comply with its terms, you avoid
jail time with this plea.’

‘You only serve three months on this plea, that’s the sentence.’”

The Committee then turned to a prosecutor’s ethical obligations when extending a plea offer to an unrepresented defendant, and clarified that prosecutors cannot do so unless there is sufficient evidence to support the plea offer: “Under Model Rules 1.1, 1.3, 3.8(a), and 8.4(a) and (d), prosecutors have a duty to ensure that charges underlying a plea offer in misdemeanor cases have sufficient evidentiary and legal foundation.”

Finally, the Committee noted that a prosecutor’s ethical obligations extend to post-plea interactions: “If a prosecutor learns during the plea colloquy with the court or other interactions that the unrepresented accused’s acceptance of a plea or waiver of the right to counsel is not in fact voluntary, knowing, and intelligent, or if the plea colloquy conducted by the court is inadequate to ascertain whether the plea or waiver of the right to counsel is in fact voluntary, knowing, and intelligent, the prosecutor is obliged to intervene.”

That this opinion was even issued, my friends, is an unfortunate reminder of the state of our profession in 2019. That being said, it serves as a welcome, and much-needed, reminder to prosecutors who may be walking a fine ethical line when it comes to many of these issues: always ensure that you walk on the right side of that line, or risk losing your license to practice law.

Nicole Black is a Rochester, New York attorney, author, journalist, and the Legal Technology Evangelist at MyCase, law practice management software for small law firms. She is the author of the ABA book Cloud Computing for Lawyers, co-author of the ABA book Social Media for Lawyers: The Next Frontier, and co-author of Criminal Law in New York, a Thomson Reuters treatise. She writes legal technology columns for Above the Law and the ABA Journal and speaks regularly at conferences regarding the intersection of law and technology. You can follow her on Twitter at @nikiblack or email her at niki.black@mycase.com.


New York court on privacy expectations in social media accounts


*****

Every year around this time I begin to conduct research for the annual update to the Thomson Reuters criminal law treatise, “Criminal Law in New York,” that I co-author with Brighton Town Court Judge Karen Morris. During the course of my research I often discover cases that arise from interesting overlaps of technology and criminal law.

This year has proven to be no different, and last week I stumbled upon an interesting case from New York City Criminal Court that focuses on whether law enforcement access to social media accounts triggers constitutional privacy interests.

In People v. Sime, 62 Misc.3d 429 (2018), one issue addressed by the Court was whether the defendant had a constitutionally protected privacy interest in the IP data and photograph metadata that she had uploaded and shared online via a public Instagram account.

In this case, the defendant was charged, in part, with unlawful disclosure of an intimate image in violation of Administrative Code of the City of New York § 10-177[b][1]. It was alleged that the defendant, who was dating the complainant’s ex-boyfriend, posted nude photos of the complainant to two different Instagram accounts. The photos were allegedly taken by the complainant’s ex-boyfriend. One of the Instagram accounts was alleged to belong to the ex-boyfriend, and the other was alleged to have been created in the complainant’s name by the defendant. As part of that prosecution, the court issued a search warrant to Instagram seeking access to the data connected with the two accounts.

The defendant challenged the search warrant, asserting that it was not supported by probable cause. She conceded that she did not have a privacy interest in the posted photos since they were shared on an account that was open to the public and had no privacy settings enabled. Accordingly, her argument was based instead upon the assertion that “there is a general right to privacy for the IP addresses associated with the person who posted the pictures and the metadata contained in the photographs public (sic.) posted pursuant to the recently decided case Carpenter v. United States, 138 S.Ct. 2206 [2018].”

As I explained in my article last week, in Carpenter the Court held that a warrant was required in order for law enforcement to access historical cell phone geolocation data. In the case at hand, the Court disagreed that the Carpenter holding was applicable, on the grounds that IP data and metadata relating to an Instagram photo are not analogous to cell phone geolocation data.

The Court explained that unlike historical cell phone geolocation data, IP data does not necessarily provide information regarding the defendant’s specific location:

“Obtaining IP data does not provide the police the ability to exhaustively know a defendant's exact position — at best it might incidentally reveal what device was used to post a photograph in the general vicinity of an internet router. In other words, at most it will let the police find a building near the used cell phone or computer device on discrete dates when pictures were uploaded for the public to view, and has no bearing on the defendant's day-to-day movement…Similarly, photograph metadata might let you know what camera was used to take a particular picture, and (if it was not already obvious from the picture itself) where that picture was taken.”

Because IP data and metadata provide only a brief snapshot of the user’s location at any given time, the Court compared IP data and metadata to telephone billing records, in which customers have a lower expectation of privacy: “IP data and metadata are roughly analogous to telephone billing records, and there is no legal reason to protect this data to the same extent as long-term GPS data and cell-site information.”

Accordingly, the Court denied the defendant’s motion challenging the search warrant, concluding that “(T)here is no constitutional privacy afforded to the IP data and photograph metadata that the defendant uploaded and shared with the world, nor would a subjectively held privacy expectation be reasonable or one that society is prepared to recognize.”

Digital privacy rights are an important and evolving issue. Now that online interaction and mobile device usage are commonplace, data regarding all aspects of our daily lives is regularly collected by a host of third parties. As law enforcement increasingly seeks access to that information, courts will necessarily continue to grapple with the constitutional nuances presented by varying factual scenarios - and rest assured, I’ll continue to cover their efforts in this regard.

 



Massachusetts weighs in on law enforcement access to real-time geolocation data


*****

 

Now that most Americans own smartphones, privacy issues abound. Our devices collect a vast array of information about us. Some of this data is stored on our devices and some is shared with our service providers. As a result, one issue that has cropped up repeatedly is when and how law enforcement may access cell phone data.

One particular type of data often sought by law enforcement is geolocation information. Our mobile devices provide both real-time and historical data regarding our location at any given time. Obviously this information has the potential to be incredibly valuable in the context of a criminal investigation, so it’s not surprising that law enforcement often seeks to obtain it.

The United States Supreme Court addressed the issue of whether law enforcement may obtain historical cell phone records last year. In Carpenter v. U.S., 138 S. Ct. 2206 (2018), the Court held that a warrant was required in order to access historical cell phone geolocation data.

The law is not yet settled regarding access to real-time cell phone data, however, so I read with interest a Massachusetts Supreme Judicial Court opinion that was handed down last week and addressed this very issue. In Commonwealth v. Almonor, No. SJC-12499, the Court considered “whether police action causing an individual’s cell phone to reveal its real-time location constitutes a search in the constitutional sense.”

In this case, the defendant was identified as a murder suspect, and one of the witnesses to the crime provided police with the defendant’s name and cell phone number. After obtaining other evidence, the investigating officer contacted the defendant’s cell phone provider and requested several pieces of information, including the precise, real-time location of the defendant’s cell phone.

Eventually the provider “pinged” the defendant’s cell phone and provided law enforcement with its exact location. Officers then drove to that location, obtained consent to enter the home, and arrested the defendant there. The defendant moved to suppress on the grounds that the ping of his cell phone was a search under the Fourth Amendment and Article 14 of the Massachusetts Declaration of Rights.

In reaching its decision on the issue, the Court acknowledged that a delicate balance was required when considering the enhanced surveillance capabilities that technological advances provided law enforcement. The Court explained that it is important to carefully “guard against the…power of technology to shrink the realm of guaranteed privacy…(and) that privacy rights cannot be left at the mercy of advancing technology but rather must be preserved and protected as new technologies are adopted.”

The Court noted that when police direct a service provider to “ping” a cell phone to determine its real-time location, it raises “distinct privacy concerns,” especially since said data would not be collected in the absence of law enforcement’s request. Notably, the Court determined that there is a reasonable expectation of privacy in this situation since cell phones are such an indispensable part of our lives and provide an incredible amount of information about their owners. The Court explained that “society reasonably expects that the police will not be able to secretly manipulate our personal cell phones for any purpose, let alone for the purpose of transmitting our personal location data.”

As such, the Court concluded that it constitutes a search when law enforcement obtains real-time location data from a cell phone provider, since doing so intrudes on the cell phone owner’s reasonable expectation of privacy. The Court explained that to conclude otherwise would “shrink the realm of guaranteed privacy…under art.14 and leave legitimate privacy rights at the…mercy of advancing technology."

Although the Court held that the exigent circumstances exception applied to the facts of this case, the overall holding is a step in the right direction.

Technology is pervasive in our lives and offers so many benefits. But when used by law enforcement, it can sometimes be abused in new and increasingly invasive ways. Decisions like this one provide much-needed analysis and insight into the application of constitutional protections in the face of rapidly evolving technological innovation.



When technology and law enforcement collide


*****

Law enforcement officers have no problem using the latest and greatest technologies to police the people, whether it’s facial recognition tools, cellphone geolocation data, or recordings obtained from smart home devices such as Amazon’s Alexa. But it seems that when the people use the very same tools to police the police - well, that simply won’t do.

For example, we know that the police typically don’t like being recorded while effecting an arrest: they will often order bystanders to refrain from doing so, and have even been known to take custody of devices and delete data from them. Along the same lines, law enforcement has never been a fan of a more mundane, less tech-savvy practice that many motorists engage in: flashing their headlights in order to warn other motorists of a speed trap.

So I wasn’t surprised to learn that the New York Police Department had set its sights on the 21st-century version of headlight flashing: the Waze app’s user-submitted reports regarding speed traps and DWI checkpoints.

According to the New York Times, last weekend the NYPD’s acting deputy commissioner for legal matters, Ann P. Prunty, sent a letter on behalf of the NYPD to Google (the owner of the Waze app) to demand that it remove that feature from Waze. The rationale for this request was as follows: “The posting of such information for public consumption is irresponsible since it only serves to aid impaired and intoxicated drivers to evade checkpoints and encourage reckless driving. Revealing the location of checkpoints puts those drivers, their passengers, and the general public at risk.”

If Google refused to do so, Prunty indicated that the NYPD would pursue all legal remedies available to it to achieve its goal of preventing people from sharing said information via the app.

For starters, this request, if granted, would likely infringe on the First Amendment rights of ordinary citizens, but that’s an issue the courts will have to grapple with if the NYPD does indeed pursue legal remedies. That’s certainly an interesting issue, but what I found even more interesting was that the letter was a perfect example of a knee-jerk reaction to technology.

I say this because people have always found ways to share information regarding the arrival or location of the police. There are code words used by kids on the street that warn others when police appear on the scene. And, as mentioned above, motorists flash their headlights after encountering a speed trap to warn other drivers. Similarly, truck drivers use their CB radios to communicate the whereabouts of police to other truckers. And motorists have likewise used cell phones to share that same information via phone calls.

In other words, citizens have always found ways to communicate with one another with the end goal being to avoid police interaction. But in the past they’ve used the only methods available to them at the time, which were certainly less effective and not nearly as far-reaching as an app like Waze.

Enter technology and the power of social media, and suddenly ordinary citizens have the ability to broadcast their observations of law enforcement activities far and wide. It’s important to note, however, that while the efficiency and reach of the information sharing has improved, the essence of it is the same. It’s simply people communicating with one another regarding situations that are occurring in plain sight. Technology and social media have simply amplified their voices.

In other words, as I’ve oft repeated in this column since 2008, the medium doesn’t change the message. And in this case, I would argue that the message falls within the parameters of free speech, and that the imminent danger exception does not apply. The fact that the message is now more easily transmitted to a larger number of people doesn’t change that.

The NYPD seems to have lost sight of the fact that the online world is simply an extension of the offline one. Should it follow through with its threat to litigate, this will be an interesting case to follow. I strongly suspect that First Amendment rights will trump law enforcement’s knee-jerk reaction to technological innovation, but only time - and a lawsuit - will tell if I’m right.



Federal judge on whether biometric access to phones requires a warrant


*****

A decade ago, smartphones were in their infancy. The iPhone was less than two years old and widespread adoption had not yet occurred. Many were suspicious of the touchscreen interface, and lawyers in particular clung to the idea that they required the tactile feel of a traditional keyboard.

Fast forward to 2019, and smartphones are commonplace even amongst lawyers. In fact, according to the latest ABA Legal Technology Survey Report, 95% of all lawyers use smartphones on a daily basis.

Not only has smartphone usage grown over the past decade, so too have the technologies that power the devices. Today’s smartphones are essentially minicomputers with memory and processing power comparable to that of some desktop and laptop computers. For that reason, smartphones have become indispensable and people store all sorts of information on them.

It’s no surprise then that law enforcement routinely seeks access to smartphones of suspected criminals. Of course, constitutional protections still apply. For example, for a number of years now, it has been generally accepted that law enforcement cannot require you to provide the password to your smartphone, since doing so is compelled testimony and thus falls under the protection of the Fifth Amendment.

However, with the release of smartphones with biometric unlocking features, the waters were muddied. Many courts subsequently concluded that the biometric data used to unlock phones (i.e., fingerprints and faces) is not inherently testimonial, and thus that requiring a defendant to open a device using biometric data does not violate the Fifth Amendment.

The tide may be turning, however, with a recent federal district court decision handed down on January 10th. Northern District of California Magistrate Judge Kandis A. Westmore considered this very issue and issued an important ruling in In the Matter of the Search of a Residence in Oakland, California (online: https://tinyurl.com/ycs4wdy7). Specifically, the Court considered whether to grant a search warrant under which any individual present at the time of the search could be compelled to “press a finger (including a thumb) or utilize other biometric features, such as facial or iris recognition, for the purpose of unlocking the digital devices found in order to permit a search of the contents…”

In reaching its decision, the Court first concluded that the search request was overly broad and that there was insufficient probable cause to 1) compel anyone other than the suspects to unlock their devices or 2) seize the devices of anyone other than the suspects who were present at the time of the search.

Next, the Court turned to the issue of whether the suspects could be required to provide biometric data to unlock any devices reasonably believed to belong to them. At the outset, the Court wisely noted that, because of the rapid pace of technological change, courts must adopt rules that take into account sophisticated technologies that currently exist or are in development, and that courts “have an obligation to safeguard constitutional rights and cannot permit those rights to be diminished merely due to the advancement of technology.”

The Court then turned to ascertaining whether providing biometric data is a testimonial act, and concluded that it was: “(A) biometric is analogous to the nonverbal, physiological responses elicited during a polygraph test, which are used to determine guilt or innocence, and are considered testimonial.”

Finally, the Court reiterated the Supreme Court’s 2014 determination in Riley v. California that today’s smartphones contain large amounts of incredibly private data regarding the owner of the phone and others with whom that person communicates: “smartphones are minicomputers…a search of which ‘would typically expose to the government far more than the most exhaustive search of a house. A phone not only contains in digital form many sensitive records previously found in the home, it also contains a broad array of private information never found in a home in any form…’”

For any number of reasons, this ruling is notable. For starters, the Court acknowledged the undeniable effects of the rapid pace of technology on our culture. It was reassuring to read this thoughtful, insightful ruling, especially since it took into account the nature of rapidly evolving technologies and how they may potentially - and sometimes unintentionally - impact our constitutional rights. Also of import is the Court’s understanding of existing technology and its on-point comparison of that technology to more traditionally accepted testimonial evidence.

In short, I believe that the conclusion reached by the Court was the correct one. Let’s hope other courts follow suit.



Fitbit Evidence Provides Alibi For Victim’s Boyfriend


*****


Last week, I wrote about a recent case where Fitbit data was used in a California case to convict the defendant, the victim’s step-father, of her murder. In that case, the victim was wearing a Fitbit and her heart rate data obtained from the device conflicted with the defendant’s version of events, ultimately resulting in his conviction.

That wasn’t the first time I covered the impact of wearable devices in court. In 2015, I wrote about two cases where Fitbit data was used in litigation: one where it was offered as evidence to support a personal injury claim and the other where it was used to disprove a complainant’s rape allegations. Then in 2017, I covered a case where Fitbit data and other digital evidence was used in a Connecticut murder prosecution to convict the defendant of murdering his wife.

Another criminal case from earlier this year in Wisconsin that I haven’t yet covered was notable because it involved Fitbit data being used as alibi evidence. In that case, the defendant, George Burch, alleged that the victim’s boyfriend, Doug Detrie, had forced him to commit the murder at gunpoint. However, a host of evidence, digital and otherwise, belied his assertion.

A good portion of the digital evidence used to pinpoint Burch’s movements on the night of the murder was obtained from his cell phone and Google Dashboard. Using that data, expert witnesses were able to show the jury that Burch was at the scene of the murder on the morning in question and then subsequently traveled to the location where the body was disposed of after the murder was committed.

Burch’s defense was that although he committed the murder and disposed of the body, he did so because Detrie held him at gunpoint and forced him to commit those acts. Fortunately for Detrie, he was wearing a Fitbit at the time of the murder and the Fitbit data contradicted Burch’s claims.

Not all of the Fitbit data was admissible, however. Specifically, the data showing that Detrie was sleeping at the time of the murder was held to be inadmissible due to scientific disagreement regarding the reliability of that specific data. Other Fitbit data was deemed admissible, however, and that data provided an alibi that made all the difference in this case.

According to the Fitbit data, Detrie didn’t take nearly the number of steps on the evening of the murder that would be required for his activity levels to comport with the movements alleged by Burch. The Fitbit data showed that Detrie took 20-30 steps at approximately 4 a.m. on the morning of the murder; he asserted that he went to the bathroom at that time. Burch’s claims, by contrast, would have required Detrie to walk at least two miles that evening.

After hearing the testimony and considering the evidence, the jury concluded that Detrie was wearing his Fitbit on the evening of the murder and that the data obtained from it was accurate - and that it provided him with a much-needed alibi. The jury thus discounted Burch’s version of events and convicted him of the murder.

This is yet one more example where data from a wearable device provided crucial evidence that made all the difference in the outcome of the case. It’s also further proof that the devices we rely on and carry with us 24/7 collect a wealth of information about our movement and activities, all of which is readily accessible by law enforcement, sometimes with, and other times without, a warrant.

Certainly this should give you pause, and if nothing else, you might want to check the privacy settings of your smartphones, wearable devices, and the online accounts that sync with your mobile devices. Ascertain what type of data is collected and for what purpose, and then determine the value of the services provided using that data. If it’s not all that important to you, then switch off the ability to collect that data, to the extent that it’s possible.

No doubt there are pluses and minuses to living in the 21st century. The benefits include convenience, flexibility, and 24/7 access to information, but they are sometimes outweighed by the accompanying loss of privacy. The good news is that in some cases digital data can be your friend and provide you with an alibi, but that’s not always the case. The decision regarding how much privacy to sacrifice in order to take advantage of the positive aspects of living in the digital age is a personal one. The choice is yours, and it’s not always an easy one to make.



Fitbit Data Used As Evidence In A New Murder Case

Here is a recent Daily Record column. My past Daily Record articles can be accessed here.

***


Wearable devices are becoming incredibly common. Take a look around - you’ll notice Fitbits, Apple Watches, and other wearable devices on the wrists of many people whom you encounter on a daily basis, including your legal colleagues and co-workers. They’re being used to track people’s health and fitness information, to ensure people are notified of important messages and events, and to assist with navigation, among other things.

Because they track so many aspects of our lives, the data collected and stored on the devices and shared with our phones can sometimes prove invaluable in court. I find their evidentiary potential to be incredibly interesting, so I started following and writing about cases where data from wearable devices has been used as evidence in litigation. For example, in 2015, I wrote about two cases where Fitbit data was used in litigation: one where it was offered as evidence to support a personal injury claim and the other where it was used to disprove a complainant’s rape allegations.

Then in 2017, I covered a case where Fitbit data and other digital evidence were used to support a Connecticut murder prosecution. The digital evidence included 1) cellphone records for the defendant and his wife, 2) computer records from the defendant’s laptop, 3) Facebook records for the defendant, his wife, and his girlfriend, 4) text messages, and 5) Fitbit records for the victim, the defendant’s wife.

Now, there’s a new case where Fitbit data is being used in a murder prosecution, this time in California. In this case, the accused is the stepfather of the victim. The victim was discovered in her home on Thursday, September 13th by a coworker after she failed to show up for work. She was deceased, slumped over a desk, wearing a Fitbit, and holding a butcher knife. She had sustained a deep cut to her neck. What initially appeared to be a suicide was later determined to be a homicide after the medical examiner found that she’d suffered many deep wounds to her head and face.

When questioned by police, her stepfather informed them that he had stopped by her home on Saturday, September 8th to drop off pizza. He also stated that later that day he saw her again when she drove by his home with someone in the passenger seat of her car. He denied harming her.

However, evidence obtained by the investigating officers conflicted with his account. First, there was surveillance video showing that his car had been at her home for 21 minutes on Saturday, September 8th, from 3:12 to 3:33 p.m. Despite the defendant’s claims to the contrary, the video did not show her car leaving her home after that time.

There was also digital data obtained from the victim’s Fitbit. It showed that her heart rate spiked at 3:20 p.m. on September 8th. It then slowed down quickly and her Fitbit stopped registering a heartbeat at 3:28 p.m. In other words, her Fitbit showed that her heart had stopped beating during the timeframe that the defendant’s car was at her home.
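The timeline overlap described above boils down to a simple interval check: did the Fitbit events fall inside the window when the defendant’s car was at the home? A minimal Python sketch, using the times reported in the column (the variable names and the helper function are mine, purely for illustration):

```python
from datetime import datetime

fmt = "%H:%M"
# Times from the column's account of the September 8th surveillance video
car_arrived = datetime.strptime("15:12", fmt)
car_left = datetime.strptime("15:33", fmt)
# Times from the victim's Fitbit data
heart_rate_spike = datetime.strptime("15:20", fmt)
last_heartbeat = datetime.strptime("15:28", fmt)

def within_window(event, start, end):
    """Return True if an event time falls inside a surveillance window."""
    return start <= event <= end

# Both Fitbit events fall inside the 21-minute window during which the
# defendant's car was recorded at the victim's home.
print(within_window(heart_rate_spike, car_arrived, car_left))  # True
print(within_window(last_heartbeat, car_arrived, car_left))    # True
```

Trivial as the comparison is, it captures why this kind of timestamped data is so persuasive: two independent sources, the surveillance camera and the wearable, each log events on a shared clock, so investigators can line them up directly.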

Based on the surveillance video and Fitbit evidence, and his conflicting account, he was arrested and charged with her murder. The case is still pending, so his ultimate fate remains unknown. But it’s a great example of the valuable evidence that can be obtained from wearables. While certainly not conclusive, when considered in conjunction with other evidence discovered throughout an investigation, this type of data can sometimes make - or break - a case. Tune in next week for an example of a case where, instead of making the prosecution’s case, wearable data instead provided the accused with a viable alibi.

 



Juror misconduct and technology: a perfect storm

Here is a recent Daily Record column. My past Daily Record articles can be accessed here.

*****

 


As I mentioned in recent columns, I’m in the process of drafting my half of the annual update to “Criminal Law in New York,” a substantive criminal law treatise that I co-author with Brighton Town Court Judge Karen Morris. Every year, in the course of my research, I stumble upon cases that offer an interesting perspective on the intersection of law and technology. This year was no different, and one particularly timely issue that I encountered involved juror misconduct arising from jurors’ improper use of technology.

Oftentimes these cases involve jurors using social media platforms to discuss trial proceedings despite being instructed not to do so, but the two cases that caught my eye while researching this summer involved jurors improperly using other types of technology in ways alleged to have affected criminal trials.

In this column I’ll discuss the first case, People v. Neulander, 162 A.D.3d 1763 (4th Dep’t 2018), where the defendant was convicted of murder in the second degree. One issue on appeal was whether a number of text messages sent by a juror during the trial to friends and family constituted juror misconduct that created a significant risk that a substantial right of defendant was prejudiced.

Specifically, as established during the hearing on the defendant’s motion to set aside the verdict, the juror in question exchanged the following text messages with her father and her friends during the trial:

She received a text message from her father that stated: “Make sure he's guilty!” During the trial, juror number 12 also received a text message from a friend asking if she had seen the “scary person” yet. Juror number 12 responded: “I've seen him since day 1.” She admitted at the subsequent hearing into her misconduct that she knew the moniker “scary person” was a reference to the defendant. Another friend sent juror number 12 a text message during the trial that stated: “I'm so anxious to hear someone testify against Jenna [defendant's daughter].” Juror number 12 responded: “No one will testify against her! The prosecution has already given all of his witnesses, we are on the defense side now! The prosecutor can cross examine her once she is done testifying for the defense.” Later that night, the same friend replied via text message: “My mind is blown that the daughter [Jenna] isn't a suspect.”

This conduct was reported to the court by an alternate juror after the guilty verdict had been rendered. In her affidavit in opposition to the motion to set aside the verdict, juror number 12 stated that she had followed all of the court’s instructions. Nevertheless, a subsequent forensic examination of her cell phone showed that she had deleted many messages and erased her web browsing history, and she was unable to provide any explanation for doing so.

Based on the evidence adduced at the hearing, the court granted the defendant’s motion for a new trial, concluding that “due to juror number 12's flagrant failure to follow the court's instructions and her concealment of that substantial misconduct, defendant, through no fault of his own, was denied the opportunity to seek her discharge during trial on the ground that she was grossly unqualified and/or had engaged in substantial misconduct…thus…(the) defendant established by a preponderance of the evidence that juror number 12 engaged in substantial misconduct that ‘created a significant risk that a substantial right of ... defendant was prejudiced.”

This case is a great example of the reality that even tools as familiar and simple as texting can have a significant impact on trials. So don’t make the mistake of discounting or overlooking the potential effect of “old school” technology on your client’s case.

In next week’s column, I’ll discuss a juror misconduct case in which jurors conducted legal research on their home computers and used video editing software to enhance images from a video in evidence. So make sure to tune in next week!



Does accessing historical cell site information require a warrant?

Here is a recent Daily Record column. My past Daily Record articles can be accessed here.

*****


Every summer I write my portion of the annual update to “Criminal Law in New York,” a book on substantive New York criminal law that I co-author with Brighton Town Court Judge Karen Morris. While researching the criminal cases handed down over the past year, I often come across decisions that provide interesting insights into the intersection of criminal law and technology, and I then write about them in this column.

This year one of the cases I discovered was People v. Jiles, 158 A.D.3d 75 (4th Dept. 2017), leave to appeal denied, 2018 WL 3811362 (2018). In this case, the defendant was convicted of murder in the second degree, robbery in the first and third degrees, and criminal possession of a weapon in the second degree. The defendant and unidentified accomplices were accused of holding four men at gunpoint in an apartment and taking money from them. During the robbery, another person entered the apartment, a struggle ensued, and the victim was shot and killed.

The prosecution had obtained the defendant’s cellphone records for the four days leading up to the robbery by means of a court order, issued upon a showing of less than probable cause pursuant to the federal Stored Communications Act. The prosecution sought to introduce the records at trial to show the defendant’s location during the various times that he’d called the victim in the days preceding the robbery. The defendant moved to suppress the location information, but not the call records, on the grounds that the acquisition of that information constituted a search and thus required a warrant supported by probable cause.

The trial court denied the motion and the records were admitted at trial. One issue on appeal was whether it was necessary for law enforcement to obtain a warrant in order to access historical cell phone data.

At the outset, the Court outlined the breadth of information that can be obtained from cell phone data: “When citizens go about their lives with cell phones turned on, the phones can electronically register with the nearest cell tower every few seconds whether or not the phones are actively in use, and the business records of service providers can therefore contain information about the location of phones and their users at specific dates and times as the users travel the highways and byways of our state and nation.”

Next, the Court turned to the issue at hand. In reaching its decision, it emphasized that the location data sought was historical, was kept as a matter of course by the defendant’s cell phone service provider, much like telephone billing records, and was information that the defendant had voluntarily disclosed to his service provider.

The Court explained that because the information was not obtained through direct surveillance by law enforcement, but instead constituted historical data voluntarily provided to a third party, his service provider, the data fell under the third-party doctrine: “(W)e conclude that the acquisition of the cell site location information was not a search under the Fourth Amendment to the federal constitution because defendant's use of the phone constituted a voluntary disclosure of his general location to his service provider, and a person does not have a reasonable expectation of privacy in information voluntarily disclosed to third parties.”

The Court also noted that “certain other states have afforded cell site location information greater protection under their state constitutions than it is afforded under the federal constitution” but declined to do so in New York, and instead likened location data to telephone billing records, which the New York Court of Appeals has permitted access to in the absence of a warrant. As such, the Court concluded that “there is ‘no sufficient reason’ to afford the cell site location information at issue here greater protection under the state constitution than it is afforded under the federal constitution.”

I’m not sure I agree with the Court’s conclusion that this type of data is “voluntarily” disclosed to cell phone service providers. At present, those of us who choose to use cell phones don’t have much of a choice: service providers collect that data as a matter of course, and there’s no mechanism available to opt out. Our hands are essentially tied.

The Court did address this particular argument at the end of its opinion, noting that “(t)o the extent that ‘cell phone users may reasonably want their location information to remain private’ under these circumstances, their recourse is ‘in the market or the political process…”

Unfortunately, for New York residents interested in protecting their privacy rights and preventing law enforcement from arbitrarily accessing our minute-by-minute movements in the digital age, the Court’s suggested course of action offers cold comfort - and no immediate recourse - other than refraining from using cell phones altogether. For the vast majority of us, that’s not a feasible option, and thus we’re forced to agree to a full-scale waiver of our privacy rights in exchange for the ability to use a piece of technology that has become an integral part of our daily lives. Not exactly an equitable bargain, if you ask me.



Can consent to search be obtained via Google Translate?

Here is a recent Daily Record column. My past Daily Record articles can be accessed here.

*****


Technological advances over the past decade have occurred at an unprecedented rate. As a result, there have been drastic improvements in machine learning and artificial intelligence technologies in recent years, making many science fiction fantasies a newfound reality. A great example of this is Google Translate, a tool that instantly translates speech.

Within the last few years, Google Translate has become widely available as a free online and mobile app that provides the immediate ability to translate words, both spoken and written, from one language to another. Because it’s so easily accessible, it should come as no surprise that it was recently used by law enforcement to interact with a suspect, resulting in a case that addressed an interesting constitutional question. Specifically, earlier this month, in U.S. v. Cruz-Zamora, the United States District Court for the District of Kansas considered whether a non-English speaking individual can consent via Google Translate to a search of his car by law enforcement.

The case arose from a traffic stop which was initiated because of the defendant’s suspended registration. At the beginning of the encounter, the officer realized that the defendant spoke very little English. He then moved the defendant to his patrol vehicle and began to communicate with him using Google Translate via his car’s laptop. While speaking to him using Google Translate, the defendant allegedly gave the officer permission to search his vehicle, which the officer did, leading to the discovery of illegal drugs.

The defendant later alleged that the search was unconstitutional. During the suppression hearing, the officer admitted that a live translator would have been preferable but none were available. He also admitted that the defendant didn’t always understand his questions.

Two professional interpreters also testified at the hearing, and after reviewing the video and audio recordings of the encounter, both opined that it was clear that the defendant was often confused when responding to questions and didn’t always seem to understand what was being asked of him. They also testified that Google Translate failed to take context into consideration and thus “should only be used for literal word-for-word translations.”

In its opinion, the Court initially explained that it was the defendant’s contention that “any evidence obtained as a result of the car search should be suppressed because he did not understand (the officer) and therefore could not knowingly consent to the search.”

Next, the court determined, based primarily on the testimony of the professional interpreters, that “while it might be reasonable for an officer to use Google Translate to gather basic information such as the defendant’s name or where the defendant was travelling (sic), the court does not believe it is reasonable to rely on the service to obtain consent to an otherwise illegal search.”

The Court explained that although the audio and video recordings of the encounter showed that the defendant had a basic understanding of the questions asked of him, the testimony of the interpreters and a review of the transcript indicated that the defendant’s purported consent to search was invalid. The Court concluded that it did “not find the government ha(d) met its burden to show defendant’s consent was ‘unequivocal and specific and freely and intelligently given.’”

Next the court turned to an alternative argument made by the government: that the good faith exception applied, and thus the evidence should not be suppressed. Specifically, the government contended that the officer acted in good faith since he reasonably relied on Google Translate and its translations. In opposition, the defendant asserted that the officer could not “reasonably rely on a mistake of his own making.”

The Court agreed with the defense, and excluded the evidence:

“(T)he good-faith exception does not apply as it is not reasonable for an officer to use and rely on Google Translate to obtain consent to a warrantless search, especially when an officer has other options for more reliable translations. The government has not met its burden to show defendant’s consent was “unequivocal and specific and freely and intelligently given,”…and the court will not interpret defendant’s compliance with Wolting’s instructions to stand by the side of the road during the search as implied consent, considering the totality of the circumstances. The court finds that application of the exclusionary rule is appropriate in this case, and therefore grants defendant’s motion to suppress.”

The lesson to be learned is that while technology has dramatically improved in recent years, it’s often far from perfect. Tools like Google Translate are improving by leaps and bounds, but it’s ill-advised to rely on them indiscriminately when comprehension is crucial and carries legal ramifications. Technology is not a panacea; it merely supplements hard-earned skills and expertise - it doesn’t replace them.

 
