Our Methodology

How We Evaluate Legal Technology and Make Recommendations

The comparison tools on this site — the platform selector, the comparison matrices — provide directional guidance for self-directed research. Engagement-level recommendations go further: they are built around your firm's specific workflows, staff capacity, and existing stack, not drawn from a scoring formula. This page explains what we evaluate, how we build recommendations, and how commercial relationships are handled. If you want to understand our process before engaging, start here.

Evaluation Criteria

What We Look at Before Recommending Anything

Most technology sales start with a demo and end with a contract. We start with the firm's actual situation. These are the nine factors we look at in every assessment.

01

Firm size and practice area fit

Every platform has a real sweet spot. A solo family law attorney and a 12-attorney PI firm have genuinely different needs. Platforms built for enterprise-scale practice often carry complexity and cost that smaller firms cannot absorb. We start here before we look at anything else.

02

Intake and client-response requirements

Where are prospective clients coming from, how quickly does your firm need to respond, and what happens to inquiries that don't get a timely follow-up? Intake design shapes platform selection more than most firms expect.

03

Workflow compatibility

How do your staff actually work today? We map your current process before recommending a replacement. A new platform that ignores existing workflow is a failed implementation waiting to happen.

04

Implementation complexity

Some platforms take an afternoon to configure. Others take six weeks with a consultant. We are direct about which is which, and we account for your team's capacity to absorb a change.

05

Integration and data flow

What tools does the platform need to connect to? We trace the full data flow (intake forms, call answering, CRM, case management, billing) before recommending anything that adds to an existing stack. For law firms, how client information moves between systems is a matter of professional obligation, not just efficiency.

06

Team adoption and ease of use

A system no one uses is a failed implementation regardless of the platform. Staff fluency, training time, and change tolerance are real variables in every recommendation we make. For AI-assisted tools, we also assess supervision requirements and whether staff have the context to use the tool correctly.

07

Reporting and visibility

Can you see your intake funnel? Do you know your sign rate? Visibility requirements affect platform selection: some firms need built-in reporting; others need external tools to get useful data.

08

Cost and operational tradeoffs

License costs, implementation costs, ongoing support, and the cost of switching. We factor in all of it, not just the monthly subscription price shown on the vendor's pricing page.

09

Client privacy and data handling

Technology selection is also a data-handling decision. We evaluate how each platform manages client information, what access controls are in place, and whether the configuration respects the firm's confidentiality obligations. For AI-assisted tools, data-use policies and training data boundaries are part of this assessment.

The difference better systems make is measurable: Clio's 2022 research found that attorneys at firms using cloud-based practice management were 43% more likely to report satisfied clients than those at firms that did not (source). The association reflects systematic adoption quality, not any single platform's effect in isolation.

The professional expectation is also clear: in the ABA's 2024 technology research, 71% of attorneys said they have a responsibility to understand the benefits and risks of technology as part of competence (source). That is the standard every platform assessment should meet: not just whether a tool has the right features, but whether it fits how the firm works and what it handles.

AI Tool Evaluation

AI-assisted tools require additional evaluation depth beyond the nine criteria above. When assessing AI tools for client-information use, we extend criterion 09 to cover: whether training-data exclusion is confirmed in writing rather than assumed from the plan tier, the vendor's data-retention behavior and whether deletion on demand is available, whether a signed data-handling agreement is accessible, and whether the firm will have organizational admin controls and a named owner after adoption.

We evaluate AI tools independently: we have no affiliate or referral relationships with any AI vendor, and we apply the same evaluation criteria whether or not a commercial relationship exists with the other platforms we review. For the full AI evaluation framework, see the Legal AI Comparison Matrix →

The Recommendation Process

From Intake Audit to Working System

Step 01

Evaluate current state

We start with what you have: your current software, intake process, client-response workflow, and where opportunities are being lost. The free Intake Audit is the structured version of this step. Without understanding the current state, a platform recommendation is a guess.

Step 02

Identify the bottleneck

Most firms have one or two specific failure points, not ten. Identifying the real bottleneck determines whether the fix is a software change, a workflow change, or both. Skipping this step is how firms end up buying a new CRM when their actual problem is a broken follow-up workflow, or adopting new tools without the implementation discipline required to make them useful.

Step 03

Compare viable options

We look at platforms that genuinely fit your firm size, practice area, and budget. Not a ranked list from a vendor comparison site: a short list of real options with honest tradeoffs, including recommendations that cost us referral revenue when a non-paying platform is the better fit.

Step 04

Recommend based on fit

Our recommendation is specific: here is the platform, here is why it fits your situation, here is what implementation requires, and here is what we will still need to solve after the platform is live. You get a written deliverable, not a verbal suggestion. That written recommendation covers the chosen platform and the fit rationale specific to your firm, an implementation scope and timeline estimate, adoption requirements, integration dependencies, and the gaps the platform alone will not close. A second option is included where one exists and genuinely fits. The whole deliverable is scoped to what we actually know about your situation.

Step 05

Implement and refine

We do the implementation work with your team: configuration, workflow setup, integrations, staff training, and the part most vendors skip, which is staying involved until the system is actually being used. Engagements don't end at go-live. In practice, this means regular working sessions with your team during configuration, a defined pre-launch checklist, and a structured post-launch period where we verify adoption and address what isn't working. We work around your team's capacity. This is collaborative, not delegated: we need access to your stack and your staff, and we adjust scope if the situation changes.

Limits and Commitments

What We Don't Do

Recommend tools because they pay us

Some platforms in our directory have referral relationships with us. Some don't. Filevine is the clearest example: we implement it, we have no referral arrangement with the vendor, and we recommend it where it fits, including over platforms that do pay us. The recommendation comes from the fit analysis, not the fee structure.

Pretend one stack fits every firm

There is no universal intake system. A recommendation that works for a high-volume PI practice will not work for a solo estate planning attorney, and vice versa. We do not have a default stack we push. Every engagement starts with your specific situation.

Separate software selection from implementation accountability

The recommendation is not the job. We stay through implementation, and we define "done" as the system working and being used, not the software installed and the invoice sent.

Transparency

On Referral Relationships

Some platforms on this site have referral arrangements with us. If a firm signs up through a Songbird link, we may receive a fee. Every such relationship is disclosed on the relevant platform page and in full on our disclosure page.

Some platforms (currently Filevine) carry no referral arrangement at all. We include them in our directory and recommend them based on fit, which means we sometimes recommend them over platforms that do pay us.

The reason we are transparent about this is straightforward: if you don't know which recommendations are commercially influenced, you cannot evaluate the advice. We'd rather show you the full picture than have you figure it out later.

Read Our Full Referral Disclosure →

Who This Is For

Firms That Need Real Analysis, Not Another Demo

Firms that have been sold the wrong tool by a vendor whose job ended at the signature

Firms evaluating a platform change and wanting an independent analysis before they sign anything

Solo and small practices that need a real implementation plan: not a demo, not a PDF, but a working system

PI and family law firms with intake problems they've tried to solve and couldn't, usually because the fix required both a software change and a workflow change

Firms evaluating AI-assisted tools and needing a clear view of what's ready for law-firm use, what supervision the tool requires, and what to avoid for now

For a concrete example of how an engagement like this works in practice, see our case study →

Start With the Intake Audit

Free. Takes 5 minutes. We analyze your responses and deliver a written recommendation within 48 business hours.

Get Your Free Intake Audit

Takes 5 minutes. No commitment.

Book a Free Strategy Call

30 minutes. No sales pitch.