EVALUATION
DRA approached evaluation as
information development, meaning its efforts responded
to clients’ real-world information needs. Our
emphasis was on program and organizational refinement, with special
attention to the distinctive factors that more often
than not define organizations and their programs. DRA
found that these distinctive factors are often overlooked
or under-appreciated within more traditional research
agendas, which often search for simple or more
universal factors aimed at “one size fits all” approaches
to program design, operation, and evaluation.
DRA’s evaluation services were
quite inclusive, focusing on an organization’s operations,
programs, structure, processes, outcomes and/or impacts.
With this said, we also stressed efficient designs
where clients’ information-development needs were matched
with realistic assessments of clients’ resources and
organizational capacities. Our team demonstrated a firm
commitment to participatory evaluation, where building
clients’ capacities to develop information themselves was
each consult’s primary agenda.
EDUCATION AND TRAINING
All of DRA’s team members
had experience in technical assistance, training,
formal instruction and mentoring. DRA team members also
had great depth of experience in providing technical
support, ranging from one-on-one sessions to groups of all sizes.
Given DRA’s commitment to capacity-building, training
often became the first step in a continuing process, with
each new capacity leading to new aspirations and
achievements, one step at a time. For more formal
training, two team members had considerable experience
teaching at the university level and were skillful in assisting even
the most sophisticated, conceptually oriented clients
with structured curricula.
Though team members continued
with formal training and education consults, DRA’s
primary interest and depth of experience was in
mentoring and “hands on” training and assistance in
program refinement and evaluation. “Learning by doing”
was key to DRA’s
capacity-building agenda. In practical terms, this meant
that every technical assistance consult was a joint
learning experience, where lessons could be learned by
everyone entering into the educational partnership. From
the team’s perspective, practice in the field was always
desirable: capacities were built, lessons were learned
and work was accomplished.
ORGANIZATIONAL ANALYSIS
Increasingly, DRA was recognized in the field for its ability
to perform comprehensive organizational reviews, especially
where these reviews involved sensitive issues of organizational
performance. Whether for internal reports or reports for
public distribution, DRA was called upon to assist its clients with highly tailored
assessments of operations and outputs, including recommendations
for changes and improvements.
PROFESSIONAL ETHICS
For DRA’s practice, effective
consulting could not be distinguished from ethical
consulting. In a fundamental sense, evaluation
consulting is akin to professional truth telling, which
in practical terms meant that DRA’s work had to refer to
more than simply professional standards. It was obliged
to embody a commitment
to values that define a working relationship with others:
veracity, service, advancing clients’ interests rather
than our own, as well as a primary commitment to the
exploration of visions of “right action.” DRA had deep
experience in teaching and training ethics curricula,
and even deeper experience in confronting and resolving
ethical issues in the field.
CONSUMER SATISFACTION SURVEYS
DRA’s experience in designing
and applying consumer satisfaction surveys included both
large-scale and small-scale efforts.
In addition, we pioneered “expectation-loaded” surveys,
where clients were asked to acknowledge both particular
expectations and the bases for those
expectations. DRA’s work included surveys that were
commissioned as stand-alone efforts, but more often they were part and parcel of DRA’s
efforts to use a variety of measures to approximate
the true impact of organizations and their programs.
Together with other measurement strategies, such as
clinical outcomes, activity tracking, benefit-cost analyses,
and other qualitative and quantitative measurements,
DRA assisted clients in determining the need for,
characteristics of, and ultimate value of satisfaction
surveys.