D. Casey Flaherty

@DCaseyF

4K Followers · 6K Following · 421 Media · 7K Statuses

Co-Founder and Chief Strategy Officer at LexFusion. Co-Founder of Procertas. People + Process + Tech Evangelist.

Austin, TX
Joined March 2013
D. Casey Flaherty (@DCaseyF) · 9 years
I wrote a book. I hope you like it.
[attached image]
D. Casey Flaherty (@DCaseyF) · 1 year
Best analogy I can muster: Microsoft combing through customer cloud data because it might be inflammatory or illegal. A legitimate concern, but insufficient to overcome the interest in shielding enterprise data from third parties. On this, Microsoft has a clear and correct policy:
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
I deserve no credit for flagging this issue. The concern was flagged for me by a law firm denied an exemption. I simply ran it down because their commendable commitment to reading the fine print’s fine print sparked my curiosity and sense of duty.
D. Casey Flaherty (@DCaseyF) · 1 year
Small share ≠ zero. I have found law departments, law firms, and legaltech vendors painfully aware of abuse monitoring. Some have been granted exemptions. Most have been denied exemptions because they are “unmanaged.”
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
Microsoft Copilot is on a relatively short (but not blank) list of exemptions. Anecdata from conversations with law departments, law firms, and legaltech vendors on this subject indicates:
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
Seems like a bad look. Further, I would be willing to wager a not insubstantial sum that Microsoft has exempted Microsoft—i.e., as an enterprise—from abuse monitoring (pure speculation).
D. Casey Flaherty (@DCaseyF) · 1 year
But you know who does qualify to turn off abuse monitoring, and has, in fact, opted out of abuse monitoring by Microsoft employees? Microsoft. Specifically, Microsoft Copilot.
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
Absent anything official from Microsoft, my only option is to fall back on anecdote as the best available evidence. Based on the conversations I’ve had thus far, it seems very few customers and partners are managed and, therefore, eligible to turn off abuse monitoring.
D. Casey Flaherty (@DCaseyF) · 1 year
And this, dear reader, is where my search skills reach their limit. What are the eligibility criteria and process for becoming a “managed customer” or a “managed partner” and therefore eligible to have abuse monitoring turned off? ¯\_(ツ)_/¯
D. Casey Flaherty (@DCaseyF) · 1 year
Again, like turning off abuse monitoring, this appears, superficially, to merely be a matter of paperwork. An enterprise can register for Limited Access, BUT only if they are a managed account—a status for which they cannot register, per the same Limited Access resource:
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
This “managed customers” language is similarly present in the accompanying resource on Limited Access.
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
"Limited Access criteria" starts to come into focus in the Azure OpenAI Limited Access Review form used to apply for “modified abuse monitoring”—also sometimes termed an “exemption” and meaning that abuse monitoring is turned off for specific use cases.
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
I underestimated how few would read down and take the steps necessary. And I failed to read the fine print’s fine print to understand how few enterprises—clients, law firms, and legaltech companies—meet the “additional Limited Access eligibility criteria.”
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
Candidly, I had seen this before. But I did not think much of it because there is a mechanism for turning off abuse monitoring. I figured everyone would turn it off, and that would be that. How wrong I was.
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
If a prompt or generated content triggers the abuse monitoring system, the prompts and generated content are subject to human review by a Microsoft employee.
[attached screenshot]
D. Casey Flaherty (@DCaseyF) · 1 year
Excellent. But fine print follows. Below the blue box, we eventually learn that Microsoft has set up a process for “abuse monitoring.” Under their abuse monitoring program, Microsoft stores all prompts and generated content for 30 days.
[attached screenshot]
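The policy described in this tweet and the next can be summarized as a simple decision: under default abuse monitoring, prompts and generated content are stored for up to 30 days and are subject to human review by a Microsoft employee if flagged; with an approved “modified abuse monitoring” exemption, neither applies. A minimal sketch, assuming only the facts stated in the thread (the function and type names here are illustrative, not any real Microsoft API):

```python
# Toy model of the two data-handling postures described in the thread.
from dataclasses import dataclass

@dataclass
class DataHandling:
    stored_days: int             # how long prompts/completions are retained
    human_review_possible: bool  # can a Microsoft employee see flagged content?

def abuse_monitoring_outcome(modified_monitoring_approved: bool) -> DataHandling:
    """Default posture: content stored up to 30 days, reviewable if flagged.
    With an approved 'modified abuse monitoring' exemption: neither applies."""
    if modified_monitoring_approved:
        return DataHandling(stored_days=0, human_review_possible=False)
    return DataHandling(stored_days=30, human_review_possible=True)

print(abuse_monitoring_outcome(False))
# DataHandling(stored_days=30, human_review_possible=True)
print(abuse_monitoring_outcome(True))
# DataHandling(stored_days=0, human_review_possible=False)
```

The point of the toy model: the exemption is binary and account-level, which is why the thread's question of who qualifies as a “managed customer” matters so much.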
D. Casey Flaherty (@DCaseyF) · 1 year
For most law departments, law firms, and legaltech vendors I have spoken to, this translates into an (erroneous) assumption that there is no added risk of disclosure when pairing AOAIS with private company data already stored on Azure—the same misimpression I was under.
[attached screenshot]