News, views and what I choose to dos

ATRT Proposal to ICANN

Category: ICANN, Internet governance · Jun 18th, 2010

I submitted, along with two professional evaluators, a proposal to act as an independent expert for ICANN’s Accountability and Transparency Review Team, following its call for proposals on 2 June.

I have to admit I was a little surprised and disappointed that we were informed that we would not be asked to present to the ATRT in Brussels. I thought our proposal was very solid. Anyway, so the work doesn’t go to waste, it is pasted below and a PDF of it is also available.

Quick update: I didn’t get any explanation of why our proposal was rejected – but oddly enough it’s possible it may have been too cheap!

The ATRT has set aside a budget of $350,000 to $400,000 for an external evaluator. It then revised that budget into two scenarios based on the number of meetings it might hold: a smaller figure of $189,000 and a larger figure of $323,000. Our estimate came out at $91,000. So either the ATRT expects to have six people (rather than three) working full time on the issue, or it is paying *hugely* inflated rates to external evaluators.
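For what it's worth, the arithmetic behind that "six people rather than three" remark is straightforward. A rough sketch in Python (the $175/hr rate and 520-hour total come from the pricing table in the proposal below; the budget figures are the ATRT's own):

```python
# Back-of-envelope check of the budget figures above.
RATE = 175                     # our proposed hourly rate, $/hr
OUR_BID = 91_000               # our total estimate
ATRT_LOW = 189_000             # ATRT's smaller budget scenario

our_hours = OUR_BID / RATE     # hours our bid covers
atrt_hours = ATRT_LOW / RATE   # hours the smaller ATRT figure buys at the same rate
implied_rate = ATRT_LOW / our_hours  # $/hr if the ATRT assumed the same 520 hours

print(our_hours)            # 520.0
print(atrt_hours)           # 1080.0 – roughly double the staffing
print(round(implied_rate))  # 363 – or roughly double the hourly rate
```

Either way you cut it, the smaller ATRT figure assumes about twice the person-hours or twice the rate of our bid.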

The proposal as a PDF




Value Proposition

ICANN’s Accountability and Transparency Review Team (ATRT) is looking for an independent evaluator to assess the organization’s processes and procedures to ensure they are “designed and executed in a manner that ensures accountability and transparency and reflects the interests of global Internet users”.

The focus of the evaluator’s work will be on “reviewing and assessing the quality of the decision-making as a result of the processes and procedures” that exist within ICANN.

The ATRT has produced a questionnaire and asked members of the community to send it examples of where they believe ICANN’s processes and procedures have not worked effectively, as well as put forward suggestions for improvement.

It has begun to compile information and data about ICANN’s accountability and transparency and will use the evaluator’s report along with other documentation and face-to-face meetings with the community to complete its review before the end of 2010.

The ATRT would like help from an experienced evaluator that also has a good understanding of the unique nature of ICANN in order to review its activities and develop practical improvements. It is expecting the evaluator to provide two progress reports in July and August with a final report before the end of August 2010.

Proposed Solution

First Progress Report

Ersoylu Consulting/Yeast Logic will work with the ATRT, ICANN staff and community members to develop a list of examples of decision-making by the organization that merit in-depth review.

The list will be drawn up through review of responses to the ATRT questionnaire and discussions with key members of ICANN staff and community. Each example will be accompanied by pertinent details and data gleaned from initial interviews and documentation review. From the list of examples, the ATRT will be requested to choose 3-4 examples of decision-making that it would like to have reviewed for a more in-depth case study. The examples can be drawn from across the sample and be based on key variables. For instance:

• One case that is broadly viewed as a success
• One case that is broadly viewed to have fallen short
• One case whose outcome is contentious/unclear

A dedicated review website will seek input from the community and encourage discussion in order to promote transparency and accountability of the process itself.

Second Progress Report

On each of the chosen cases, a range of qualitative and quantitative data will be drawn from extensive document review and analysis, stakeholder survey, key informant interviews and focus groups.

Interviews with those involved in the cases will be sought (staff, community, and Board where appropriate), documentation surrounding each case obtained and reviewed, and the review website will invite broader input and review.

All data will then be compiled, shared, analyzed and initial conclusions drawn out. A feedback session with the ATRT and ICANN staff will be critical to check assumptions and analysis, and begin to draw conclusions and lessons learned.

Final Report

Starting from feedback to the second progress report and working closely with ICANN staff and the ATRT, additional data and analysis will be carried out. A second round of feedback sessions will help develop initial conclusions into final conclusions. Those responsible for implementing any subsequent recommendations will be presented with the analysis and encouraged to develop their own recommendations in facilitated feedback sessions, in order to ensure that conclusions and lessons remain grounded in context.

A final set of conclusions and recommendations will be provided to ICANN staff with a request for official response, with both then provided to the ATRT in the completed final report.

Ersoylu Consulting/Yeast Logic’s unique qualifications

Ersoylu Consulting is a small, woman-owned enterprise based in Southern California. Ersoylu Consulting’s lead, Dr Leah Ersoylu holds a degree in Environment & Resource Economics and a PhD in Political Science, with a focus on nonprofit organizations and public policy. Both qualifications equip her with a core understanding of many of the issues at the heart of ICANN’s processes and procedures.

Dr Ersoylu has over 10 years of non-profit experience at both the programmatic and management levels, and has worked with numerous community-based organizations. She lectures on Nonprofit Organizations, American Government, Urban Politics and Public Policy Analysis at University of California, Irvine and California State University, Long Beach.

As such, she is ideally positioned to understand the different cultures and approaches that exist within the ICANN model.

Sarp Ersoylu holds a Bachelor’s degree in Mechanical Engineering and a Master’s in Business Administration (MBA). He is a registered civil engineer and has nearly 10 years of experience in managing large-scale infrastructure projects, from design to delivery.

He is an expert in Critical Path Management (CPM) scheduling – a highly valued method of project management – and has extensive budgeting and project management experience.

Mr Ersoylu brings with him an in-depth understanding of the technical and business pressures on ICANN, as well as expertise in managing complex multi-dimensional projects.

Yeast Logic’s founder, Kieren McCarthy holds a Master’s degree in Mechanical Engineering with Management. Between February 2007 and November 2009, he acted as ICANN’s General Manager for Public Participation.

As a long-time observer and participant in ICANN’s processes from both the outside and the inside, he possesses an invaluable understanding of the complex environment that ICANN operates within, as well as real-world experience of the organization’s processes and procedures.

Mr McCarthy was a major contributor to each of the three previous efforts aimed at addressing transparency and accountability within ICANN. He acted as a contact point, staff resource, as well as author and copy editor to the One World Trust report in 2007, Accountability and Transparency Frameworks and Principles in 2008, and the Improving Institutional Confidence consultation in 2008 and 2009.

Dr Ersoylu and Mr McCarthy are members of the American Evaluation Association; Mr McCarthy is also a member of the European Evaluation Society. All evaluators are based in California, close to ICANN’s headquarters in Marina del Rey.

Pricing Details

Professional Services

Stage          Weeks   YLK   ECL   ECS   Total hours   Cost
Rate ($/hr)             175   175   175
1st Report         3     80    50    20          150   $26,250
2nd Report         2     60    70    40          170   $29,750
Final Report       3     70    80    50          200   $35,000
Total                   210   200   110          520   $91,000
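As a quick sanity check on the table (a sketch in Python; the stage names and hours are exactly those above), each stage's cost is simply the three evaluators' combined hours multiplied by the flat $175/hr rate:

```python
# Verify the pricing table: cost = (YLK + ECL + ECS hours) x $175/hr.
RATE = 175
stages = {                      # stage: (YLK, ECL, ECS) hours
    "1st Report":   (80, 50, 20),
    "2nd Report":   (60, 70, 40),
    "Final Report": (70, 80, 50),
}
for stage, hours in stages.items():
    total = sum(hours)
    print(f"{stage}: {total} hours, ${total * RATE:,}")

grand_hours = sum(sum(h) for h in stages.values())
print(f"Total: {grand_hours} hours, ${grand_hours * RATE:,}")  # 520 hours, $91,000
```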

Travel costs and expenses

The project foresees four trips to ICANN’s headquarters in Marina del Rey, one to Washington DC and one to Brussels. Exact costing will vary depending on dates, but an initial estimate based on current rates is $8,000.

Out-of-pocket expenses will be billed at actual cost in addition to professional fees.

Billing Terms

An initial retainer of 15 percent will be due at contract signing. Thereafter, billing will be twice monthly based on hours worked and any expenses incurred, with the initial retainer credited against the final invoice.

If ICANN or the ATRT request any changes in the scope of the project, the additional costs and effort will be estimated. All changes, including the time to assess the impact and cost of the request, will be billed on a time basis at the project rates.


We are proud to put forward our proposal for an independent evaluation of ICANN’s processes and procedures. The details contained therein are an initial approach and we remain open to suggestions for improvement or enhancement.


One of the most crucial aspects in any effective evaluation is a clearly identified scope for the project.

In this case, we have taken sections from two paragraphs in the Request for Proposals produced by the Accountability and Transparency Review Team (ATRT) as specifying precisely what that scope is.

The RFP affirms that the intent of the review is to “identify whether ICANN’s processes and procedures are designed and executed in a manner that ensures accountability and transparency and reflects the interests of global Internet users”.

Later on, the RFP specifically notes that “the ATRT is not seeking an audit of whether processes and procedures are in place (i.e., a Sarbanes-Oxley audit), but rather a focus on reviewing and assessing the quality of the decision-making as a result of the processes and procedures.”

The document then references a previous study, a series of adopted frameworks and principles, and a consultation as earlier work done with respect to ICANN’s accountability and transparency.

It also draws attention to a current public consultation, run by the ATRT, regarding ICANN’s current practices and procedures.

What the evaluation covers

Taking these indications into account, we propose that this evaluation focus on practical issues – rather than ideological, sociological or economic – and so concern itself with evaluating a finite number of decision-making cases within ICANN.

These examples will then be evaluated with respect to the following criteria:
1. Overall accountability of the decision-making process. In particular, how far the processes and procedures that were followed considered and incorporated the existing guiding documents, frameworks and principles
2. Overall transparency of the decision-making process
3. The “quality” of the end result. By “quality” we refer to:
• The degree to which the interests of global Internet users have been met
• The degree to which the Board members were in consensus with the decision
• The degree to which staff was successful in implementing the result
• The degree to which the decision helped to further ICANN’s mission
• The degree to which the decision impacted key stakeholder or partner relationships (either negatively or positively)

Through this assessment of case studies, the evaluation will then identify common threads that inhibit or otherwise disrupt the goal of providing a transparent and accountable process that produces a result of acceptable quality and serves the overall interest of global Internet users.

Through review and discussion of those common threads, the evaluation team will then work closely with ICANN’s primary decision-makers to identify workable improvements in the organization’s processes and procedures.

What the evaluation does not cover

ICANN plays an intriguing and ever-changing role in the broader context of governance of the Internet, particularly with regard to its “multi-stakeholder” nature and the level of autonomy it enjoys globally. The organization is also remarkable for the ongoing reviews to its own structure.

Neither of these components will be considered by this evaluation, whose focus will be purely on reviewing decisions made by ICANN in the context of the organization’s own stated goals and standards.

The evaluation will also not:

• Evaluate the performance of any individual or group
• Consider the introduction of new processes or procedures
• Investigate any disagreements or disputes that occur outside of the organization’s own processes


The RFP states that the purpose of the evaluation is to provide “an assessment of the decision-making at the Internet Corporation for Assigned Names and Numbers”.

In any evaluation, however, there are inherent questions related to best practice guidelines that must be considered:

• What are the questions that the client wants answered through the evaluation?
• Once generated, how are people going to use this information?

The context of this evaluation is as part of a series of new reviews written into an “Affirmation of Commitments” signed between ICANN and the US government in September last year. That Affirmation replaced the previous “Joint Project Agreement” (JPA) between the two parties.

The Affirmation of Commitments gives ICANN greater autonomy to act without direct oversight of the United States government. A series of four reviews are intended to replace that oversight role, providing “global Internet users” with a means to ensure that the organization continues to work in their best interests.

In this context, there are several implicit elements to this evaluation:

• That it should highlight the positive aspects of ICANN’s processes and procedures as well as the areas that could be improved
• That it should be accessible and understandable to all Internet users
• That it should consider the needs of all Internet users, regardless of their engagement in existing processes
• That it should provide useful and actionable information for improvements to current processes and procedures
• That it should provide an objective analysis by independent entities

Timeline and constraints

The timeline outlined in the Request for Proposals is very aggressive. With the contract awarded, a first progress report is expected less than three weeks later. A second progress report is then requested two weeks after that, with the final report delivered three weeks subsequent to the second report.

Considering the self-acknowledged complexity of ICANN as an organization, the fact that the window falls in the middle of the summer holidays for most of ICANN’s community, and the broad nature of the evaluation itself, this is a very tight timeframe in which to provide a workable final report.

We believe that, given the extensive knowledge of ICANN by one member of the evaluation team, a good overall result can still be achieved. Even so, effective project management will still require several considerations to be closely observed by both the ATRT and ICANN:

• Immediate provision of relevant documents
• Timely responses to requests for information
• Direct access to ATRT members and all members of ICANN’s senior management
• Proactive outreach by both the ATRT and ICANN to relevant stakeholders

We consider it critical that a separate review website be set up specifically to provide interested parties with a simple and dedicated online resource for the evaluation. We have incorporated the design and staging of that site into our work estimates.

In terms of providing useful and workable results in the timeframe presented, we have proposed a targeted case study method that will allow us to rapidly draw out key issues and lessons about ICANN’s decision-making processes and procedures.

By exploring a limited number of diverse, specific decisions made through the ICANN process, it will be possible to analyze and evaluate the organization from a practical and time-sensitive perspective as well as draw out conclusions that will be applicable to future processes and procedures.

Due to the time window for the review, it will not be possible to carry out a cost-effectiveness study of the processes or procedures, or of the additional cost-benefit of making improvements to them.

It will also only be possible to do a very limited review of participatory engagement procedures. We will provide cursory suggestions as to whether the procedures themselves are effectively engaging those that should be included, and what could be done to maximize the opportunities for involvement in those not currently engaged in the processes.


The Accountability and Transparency Review Team has set the focus for its work through observing specific examples of decision-making in its “Questions for the ICANN Community” document, released 18 May 2010.

Most of the questions ask community members to identify, with precision, examples of times when they feel ICANN has fallen short in its accountability and transparency principles, or to identify specific steps or recommendations.

We have taken this document as a procedural guide in designing an effective evaluation methodology given the time constraints. As such, we propose to not only follow this focus on 3-4 case studies but to also use the results of the questionnaire itself as a starting data point.

We differ from the questionnaire’s approach in one key element, however: whereas the questionnaire seeks precise examples of where ICANN is not performing adequately (i.e. it is seeking information with which to perform a critical analysis), we believe that achieving the goals of the evaluation will also require reviewing at least one case where the end result and process are viewed favorably by the broader community.

In this way, it will be possible to compare processes and procedures that are seen to have been successful alongside those that are seen to have fallen short. This comparative analysis is likely to yield results that are more broadly applicable to ICANN overall.

First progress report

The broad outline for the first progress report will be to identify and share with the ATRT a range of decision-making cases (ideally 7-15 cases) from which to choose a limited number (3-4) for in-depth analysis.

The first step in this process will be to review questionnaire results in order to identify any decision-making cases that are repeatedly referenced by members of the community as examples of a failure in accountability or transparency on the part of ICANN. It may be that the end result of these criticized cases is acknowledged as being positive; even so, it can be expected that they are the focus of criticism because there was not consensus about the quality of the outcome.

It will be necessary to also talk to ATRT members, ICANN staff and community members to elicit cases of positive decision-making – where the processes and procedures were seen to work and produce a viable and positive result. It is likely that ICANN’s staff will be the best source of such examples, but it is also possible that community members wish to promote examples of when they feel ICANN processes have worked.

Amid these examples of good and poor cases, there will likely be a multitude that gravitate somewhere between the two.

In the first progress report, the intent will be to provide the ATRT with a range of these examples along with sufficient detail to allow for an informed decision about which 3-4 cases to examine in greater detail.

A matrix will be prepared for the ATRT that describes each case using a list of key governance and process variables including but not limited to the following:

• Consensus about the value of the end result (good, bad, mixed)
• The relative efficiency and effectiveness of the procedures and processes used to get to the end point (fast, slow, too fast, too slow)
• Relevant data sets – and data that can be expected to be forthcoming with greater analysis
• The relative size and importance of the process (small, focused process with limited number of affected stakeholders; or large process with broad appeal and response)
• Any information that makes the example a particularly good or relevant case study in terms of a broader review of ICANN’s processes

Faced with this list of 7-15 cases, the ATRT will be encouraged to pick just 3-4 for in-depth review and analysis. The diversity of the 3-4 case studies can be based on whichever variables the ATRT deems most critical for analysis. However, our suggestion at this time is to choose cases based on the following criteria:

• One case that is broadly viewed as a success
• One case that is broadly viewed to have fallen short
• One case whose outcome is contentious/unclear

At this stage, it is not possible to predict which examples are likely to yield large amounts of analyzable data and/or widespread community interest. However, as an indication of the sorts of processes that may arise, here is a list of potential candidates:

• Revision of the Registrar Accreditation Agreement
• Changes to the Add Grace Period for domain names
• Trademark Protections in the new gTLD process
• Whois
• Redelegation requests
• Inter-registrar transfer policy
• Uniform Dispute Resolution Process
• Internationalized Domain Names

Second Progress Report

Once 3-4 representative cases have been chosen (it will not be possible to review more than this given the time constraints), the evaluation will then move on to gathering the case study data.

For this, a range of qualitative and quantitative data drawn from extensive document review and analysis, stakeholder survey, key informant interviews and focus groups will be used.

Each approach has its advantages and shortcomings, summarized briefly below. Of particular focus for the evaluation team will be scheduling interviews with relevant stakeholders as well as designing appropriate online response tools to allow for broader information gathering.


By reviewing all documentation related to each identified case, we will identify the stakeholders (staff, Board or community members) who were critical to each case study, and contact them to participate in in-person interviews. In order to gain perspective on how those not intimately tied to the decision-making process viewed the decision, we will also survey other individuals, either through the website or through another forum that the ATRT deems appropriate, to garner their opinions on the decision-making process.

A summary document of this information (bearing in mind considerations of confidentiality – see below) will be made available online, allowing for broader peer review and discussion on each case.

Once the data from the summary document (and online comment) has been compiled, the second report will be shared with ICANN staff. The ICANN staff, through a series of feedback sessions, will have time to provide one more round of analysis and feedback on the comparative case study that was presented. Following the feedback sessions, the evaluators will incorporate the suggestions and issues raised, beginning the analysis that will comprise the final report.

Final Report

The second progress report will provide a baseline comparative analysis of the 3-4 targeted case studies. Once ICANN staff and ATRT participate in feedback sessions reviewing this comparison document, the process of identifying lessons learned and drawing out conclusions for the final report can begin.

At this final round of analysis, any need for more data or deeper investigation will become clear. Using this feedback, and working closely with the primary decision-makers – in this case, ICANN staff, the ATRT and community leaders – it will be possible to tease out those details.

In order to make the end result as effective as possible, those responsible for implementing any subsequent recommendations will be presented with the analysis and encouraged to develop their own recommendations in facilitated feedback sessions, with the results from that fed back into the process.

By allowing those responsible for running the process to have a direct say in suggestions for changing it, it is possible to put recommendations into a context that includes values, experience and likely resources – in other words, to achieve a pragmatic result. Public review of this process on the accompanying website should help identify remaining divergences or areas where consensus is absent, which can then be noted in the report.

Following best practice, a final set of conclusions and recommendations will then be provided to ICANN staff with a request for an official response, which can then be appended to the final report and presented to the ATRT for consideration.


Evaluation promotes dialogue and improves co-operation between participants at all levels in the development process through mutual sharing of experiences.

Since ICANN is an Internet organization, it can assume a high level of online fluency in its community members. And considering the global nature of ICANN’s work, as well as the fact that the review concerns itself with transparency and accountability, it is vital that the review itself conform to modern Internet expectations: that information is easily accessible online at all times of day, and that people are able to comment on and discuss that information at their leisure and with a minimal amount of administration.

For this reason, we propose setting up a dedicated review website that is focused on two goals:

1. Immediacy of information regarding the review, and
2. Simple, open methods of providing input and discussion

The website will be an intrinsic part of the review, with comments and discussion on it given a similar weighting to other forms of input gathering. Those that wish to interact with the review, even to the extent of interpreting the gathered data, will be encouraged to do so.


Although the review is focused on accountability and transparency, and the evaluation team advocates publishing its own data and analysis in order to gather ongoing feedback, it is also necessary to recognize that some information gathered during the course of the process will be confidential in nature. Where possible, this material will be made available while honoring respondent confidentiality.


As discussed earlier, the intent of the evaluation is to provide the ATRT with workable, verifiable and useful information through which it can understand ICANN’s processes and procedures better and which point to possible improvements in those processes and procedures in the wider interests of global Internet users.

Although it is too soon to specify what sort of conclusions and recommendations will result, it is worth addressing the issue of post-evaluation review now, since this is a crucial factor in making any evaluation effective.

In order to have a lasting impact, it is imperative that any evaluation is itself reviewed some time after it has been delivered – typically 12 months – in order to measure the degree to which it has been adopted (or not), and the impact subsequent changes have had. In this way, an organization is able to learn and adapt to its own blindspots and cultural shortcomings.

Conversely, without such a review, an organization may find it repeats the same mistakes several times over.

The Evaluators

Yeast Logic

Yeast Logic provides communication, strategic, evaluative and social media advice and services for companies in the Internet arena. We specialize in engagement of online groups, as well as expert advice on advocacy, media, strategic campaigns and issues surrounding Internet governance.

Online participation
With more customers, community members and competitors engaging with organizations online, we offer advice and training on how to engage with people and groups online, as well as on encouraging participation and interaction through online tools.

Strategic advice
Whether seeking to persuade regulators, stakeholders or external evaluators of a particular case with a definite goal in mind, we can provide strategic guidance and advice on how best to achieve that aim, including the use of the media, online campaigning and face-to-face negotiation.

Ersoylu Consulting

Ersoylu Consulting provides policy research & analysis, program design & evaluation and strategic program management services for non-profit organizations, public agencies and other institutions. We specialize in helping our clients accurately research & evaluate issues, make effective policy decisions and attain their program goals.

We set achievable and realistic outcome measurements when conducting program analysis, work-plans or dynamic logic models. Our expertise is in ensuring full participation of stakeholders in program design, interpretation of research and evaluation findings and policy analysis.

Policy Research & Analysis
We conduct careful research, analysis and evaluation of governance and policymaking processes to support community efforts. We use both quantitative and qualitative methods to produce timely, relevant data for organizations.

Program Design & Evaluation
We bring various analytical tools to bear on an organization’s programs and projects. We have experience in both qualitative and quantitative methods such as focus group facilitation, survey data, process and outcome evaluation, community assessments, movement and coalition building, and general technical support for advocacy efforts.

Planning Services
Strategic Planning, Organizational Analysis and Charrettes are all ways that local governments and agencies can ensure that they are using their limited resources in efficient and effective ways. We are able to provide these services at various scales. Governance and accountability are key issues on which successful organizations must have consensus and clarity in order to have maximum impact. Community participation is a critical element of this; we help non-profit organizations and public institutions respond to these needs in a sophisticated and timely manner, ensuring maximum stakeholder participation in the process.


Ersoylu Consulting

Dr. Elisa Nicholas
Executive Director
The Children’s Clinic (Memorial Hospital)

Mr. Gerardo Mouet
City of Santa Ana Parks & Recreation Agency
Yeast Logic

Paul Levins
Former Vice President of Corporate Affairs, ICANN

Maxwell Cooter
Former editor, Techworld

