Practicing with Integrity in a Digital World
Ongoing learning to keep care human, ethical, and accountable as technology evolves
Technology is now part of nearly every aspect of clinical work.
How we schedule.
How we document.
How we message.
How we meet with clients.
How we store information.
Even how decisions get made.
Email, telehealth, digital forms, and AI tools are no longer add-ons to therapy — they’re woven into the fabric of practice.
And with every layer of convenience comes something equally important:
Ethical responsibility.
Part of my professional commitment is staying current with how technology and artificial intelligence impact therapy — not just learning how to use new tools, but understanding their risks, limitations, and ethical implications.
My goal is simple:
Technology should support care, never replace the human relationship at the heart of therapy.
Below are some of the trainings that have shaped how I think about digital ethics, boundaries, and responsible AI use in clinical practice.
Digital Ethics in Social Work Practice
SWTP CEUs (ASWB ACE-approved provider #2486)
Completed 2026
This course offered a broad, practical foundation for navigating ethics in today’s digital clinical landscape.
It walks through how the NASW Code of Ethics applies to online and technology-mediated care, blending policy guidance with real-world scenarios so the material feels immediately usable rather than abstract. A core theme that stayed with me was the idea that digital tools are ethical amplifiers — technology doesn’t lessen responsibility; it magnifies it.
The training explores:
ethical decision-making in digital spaces
HIPAA, encryption, and secure communication practices
texting, email, and boundary-setting with clients
social media professionalism
telehealth consent, privacy, and emergency planning
equity and access considerations
and emerging concerns around AI use in clinical settings
What I appreciated most was how grounded it felt. It wasn’t alarmist or overly technical — just thoughtful and practical. A reminder that small everyday choices (like how we text, document, or select platforms) carry ethical weight.
Why it mattered to me
Digital communication can quietly create blurred boundaries and constant availability. This course helped me slow down and be more intentional — creating clearer structures that protect both clients and myself.
It reinforced that ethical care isn’t just about compliance.
It’s about clarity, consent, and protecting trust.
The AI Revolution: Pros and Cons for the Social Work Profession
Preferra Insurance Company RRG (ASWB ACE-approved provider)
Completed 2025
I took this training specifically to deepen my understanding of artificial intelligence in clinical work.
AI is entering therapy spaces quickly — through documentation tools, risk assessments, screening algorithms, scheduling systems, and decision-support software. And while these tools can increase efficiency, they also introduce meaningful risks that aren’t always obvious at first glance.
This course focuses directly on those tensions.
It explores:
real-world uses of AI in social work
algorithmic bias and systemic inequities
privacy and data security concerns
overreliance on automation
loss of clinical nuance or human judgment
informed consent and transparency when AI tools are involved
One of the most important takeaways was a simple but powerful question:
Just because something can be automated, should it be?
The training emphasizes that AI should assist clinical thinking — never replace it — and that therapists must maintain human oversight and ethical accountability at all times.
Why it mattered to me
As someone who values both innovation and relational depth, I don’t want to adopt new technologies reactively or because they’re trendy.
I want to understand:
what they solve
what they risk
and who might be unintentionally harmed
This course helped me approach AI with curiosity and caution rather than unquestioned enthusiasm.
Because therapy isn’t just data.
It’s presence, attunement, and relationship.
And those can’t be automated.
Additional Technology & Ethics-Related Trainings
Alongside these longer courses, I’ve continued building my foundation through shorter ethics and technology-focused trainings that support everyday decision-making in practice:
AI & Practice Management: Applications and Implications — 2025
AI Tools to Support Your Private Practice — 2025
Technology in Social Work Practice: Standards of Practice — 2023
Social Work Ethics and “Everyday” Technology — 2023
Behavioral Ethics: A Lens to Examine Ethical Challenges in Social Work Practice — 2023
Ethical Considerations in Surrogate Decision Making (Live Remote) — 2023
Each of these reinforced something I come back to often: ethics isn’t just about major dilemmas. It’s about the small, daily choices — how we communicate, document, protect privacy, and hold power responsibly.
A Personal Reflection
Part of being a therapist, for me, means accepting that this work is always evolving.
New research, new language, new cultural conversations, and new technologies continually reshape how we show up for the people we serve. Staying engaged with that change — learning, adapting, asking better questions — feels less like a burden and more like an ongoing commitment to practicing with care and integrity.
Technology and AI are simply part of that evolving landscape.
And like anything new in our field, I want to approach them thoughtfully rather than reactively — not rushing to adopt everything, but not turning away either. Instead, slowing down enough to ask:
Is this ethical?
Is this supportive?
Is this actually helping people?
Some of this learning is also personal.
As someone who is dyslexic and ADHD, administrative and executive tasks can sometimes feel disproportionately heavy. Writing, organizing documentation, and keeping up with the many small moving pieces of running a practice can quietly become Herculean — not because the work isn’t meaningful, but because many systems weren’t designed with brains like mine in mind.
Thoughtfully using AI has unexpectedly opened doors for me.
When used with care — and always with strong ethical boundaries and clinical judgment — tools that help me draft blog posts, organize documentation, or structure administrative work feel less like shortcuts and more like accessibility supports. They lower the barrier to starting. They help me stay on top of tasks. They make things that once felt overwhelming feel possible and manageable.
In that way, I experience AI less as replacement and more as accommodation.
Not something that thinks for me — but something that supports me.
And that distinction matters.
Because while I remain cautious about the risks of automation, bias, and overreliance, I also want to honor the ways technology can increase access — including for clinicians with neurodivergent brains like mine.
For me, ethical technology use means both things can be true:
We stay critical and intentional.
And we allow tools to support us where we genuinely need support.
Ultimately, therapy is still human work.
AI may help with the scaffolding around the work — the notes, the drafts, the logistics — but the heart of therapy will always be presence, attunement, and relationship.
That part can’t be automated.
And I wouldn’t want it to be.