2006 Federal CASIC Workshops
 
 
Dates: Tuesday March 14 through Thursday March 16, 2006.
Place: Bureau of Labor Statistics Conference Center, Postal Square Building,
2 Massachusetts Ave., Washington, D.C. 20212
Sponsors: The Bureau of the Census and the Bureau of Labor Statistics
 

Workshops Program

Program Opening Day - Tuesday, March 14, 2006

A. Plenary Sessions (Tuesday 9:00 am to 12:00 noon)

The meetings will begin with a brief welcome and announcements by John Bosley, followed by two consecutive 80-minute plenary sessions.

P-1. Opening Keynote Plenary: Computer-Assisted Advances in Sample Design.

Colm O’Muircheartaigh, Professor, University of Chicago, and Senior Fellow, National Opinion Research Center.

P-2. Panel Plenary: Mixed-Mode Survey Design. What Are the Benefits and Costs?

Chair: Bill Mockovak <mockovak.william@bls.gov>. A panel of experts drawn from both government and private survey agencies will present their views. Time will be reserved for audience participation.


B. Concurrent Sessions (Tuesday 1:30-4:30 pm)

B-1. Recent Innovations and Lessons Learned at Participating Organizations

This session replaces the traditional Round Robin Organizational Reports. Following the model of past years' approach, the organizational reports are voluntary: only organizations that have recent innovations to share with their colleagues are asked to report. Because presentations in this session are generally limited to 10 minutes each, we ask that they focus on true innovations. Descriptions of new or continuing surveys using familiar CASIC methods may be distributed as handout supplements rather than included in the verbal presentation. The innovations can be in organization, types of surveys undertaken, software, hardware, communications, training, research, or what have you. We hope that at least the larger Federal survey agencies and Federal contractors will have something to contribute, especially on topics that do not fit into one of the half-day technical sessions. If nothing new is happening at your organization, come and see what other organizations are doing.

Schedule of Speakers:

Time | Topic | Presenter(s)
1:30 pm | BLS and Census - Introduction | Nicholls & Bosley
1:35 pm | Bureau of Labor Statistics | Bill Mockovak
1:47 pm | Mathematica | Mark Pierzchala
1:59 pm | Statistics Canada | Susan Lensen
2:11 pm | SRC-Michigan | Patty Maher
2:23 pm | Market Strategies | Reg Baker
2:35 pm | Blaise - Statistics Netherlands | Lon Hofman
2:50 pm | BREAK |
3:05 pm | Census Bureau | Cheryl Landman
3:17 pm | Westat | Jim Smith
3:29 pm | Research Triangle Institute | Karen Davis
3:41 pm | NORC | Judy Petty
3:53 pm | ORC Macro | Robert Dana
4:05 pm | CASES - U.C. Berkeley | Tom Schnetlage
4:20 pm | Summary | Bill Nicholls

B-2. Software and Application Demonstrations

This year we will continue to offer demonstrations of CASIC instruments and software in a mini exhibit hall setting, where attendees can move among exhibitors throughout the demonstration period. Space is anticipated for nine (9) concurrent demonstrations.

Only representatives of Federal agencies or Federal survey contractors may make presentations. Software vendors may participate in demonstrations only when invited by a Federal agency or Federal survey contractor to assist in its presentation.

A list of presenters should be available prior to the workshops and posted here.

Coordinator: Louis Harrell <harrell.louis@bls.gov>.

B-3. Survey Uses of Metadata

This session will concentrate on metadata for questionnaire design and the interview process. Generally, statistical metadata are data that describe some aspect of the survey life-cycle or statistical data. They convey information to someone about a survey and its components and can be used as parameters to drive automated processes. Metadata are data, so they can be managed in a database. Metadata management is used to preserve corporate memory (i.e., an aspect of knowledge management); provide a record of past designs and processes in order to enhance survey redesign; help harmonize or even standardize concepts (e.g., question wording) across surveys and time to improve survey quality; and provide a reference for people needing to understand statistical data.
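
As a purely hypothetical illustration of metadata driving an automated process, question-level metadata stored as plain records can parameterize an edit check. The field names below are invented for this sketch, not taken from any presenter's system or metadata standard:

```python
# Hypothetical question-level metadata records; the field names
# ("id", "type", "min", "max") are illustrative only.
QUESTIONS = [
    {"id": "Q1", "text": "What is your age?", "type": "integer", "min": 0, "max": 120},
    {"id": "Q2", "text": "Do you own your home?", "type": "yesno"},
]

def validate(question, answer):
    """Use the metadata record as parameters for an automated edit check."""
    if question["type"] == "integer":
        value = int(answer)
        return question["min"] <= value <= question["max"]
    if question["type"] == "yesno":
        return answer.lower() in ("yes", "no")
    return True

print(validate(QUESTIONS[0], "35"))     # True: 35 falls within [0, 120]
print(validate(QUESTIONS[1], "maybe"))  # False: not a yes/no answer
```

Because the range and type live in the metadata rather than in the code, the same check routine serves every question, and redesigning a question means editing a record, not a program.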

Presentation Materials:

Coordinator: Dan Gillman <gillman_d@bls.gov>.


WEDNESDAY MORNING – March 15, 9:00 am -12:00 noon
Concurrent Sessions

WA-1. Implementing, Designing and Evaluating Web-based Surveys.

Web-based surveys have started to mature, with common capabilities, standards, and practices taking shape. At a minimum, the understanding that it takes more than a Web server and an intern or graduate student to field a Web-based survey has sunk in. In this session we will explore the maturation of Web-based surveys through a variety of topics expected to prompt lively discussion. Presenters will discuss their experiences implementing Web-based surveys, the designs being fielded, and how quality is being evaluated. The session will close with a panel discussion among all presenters and the audience. Audience experiences and input are strongly encouraged.

Time | Topic | Presenters | Materials
9:00 - 9:05 am | WELCOME / INTRODUCTION | |
9:05 - 9:30 am | The Impact of Respondent Computer Environment on Web Questionnaire Design | Mark Pierzchala, Mark Brinkley, Leonard Hart, Scott Reid, and Chris Rankin (Mathematica Policy Research, Inc.) | presentation (PDF, 230 kb)
9:30 - 9:55 am | Evaluating a Web-based Establishment Survey of U.S. Academic Institutions using a Web-based Response Behavior Survey | Emilda Rivers (National Science Foundation), Scott D. Crawford (Survey Sciences Group, LLC) | presentation (PDF, 184 kb)
9:55 - 10:20 am | CES Web Data Collection: A Simplified Version | Richard Rosen, Antonio Gomes, Louis Harrell, Jason Chute (Bureau of Labor Statistics) |
10:20 - 10:35 am | BREAK | |
10:35 - 11:00 am | Web Data Integration: The Collection Side | Sylvain Hayes (Statistics Canada) |
11:00 - Noon | PANEL DISCUSSION / QUESTIONS | |

Coordinator: Scott Crawford <scott@surveysciences.com>.

WA-2. Best practices in CAI Authoring.

This session will discuss best practices for the development of CAI instruments. We will discuss a variety of methods, including automation, that can be used to reduce development burden and increase the quality of fielded software. Topics will include specification development, development management metrics, and testing.

Time | Topic | Presenters
9:00 - 9:10 am | Introductions |
9:10 - 9:40 am | Automated Testing of the Census CFU Instrument | Gilbert Rodriguez and Bharathi Jayanthi (RTI)
9:40 - 10:25 am | A Metadata Authoring Tool for Surveys | Debra Reed-Gillette, Shannon Corcoran, Michael Volynski, Ph.D. (DHNES)
10:25 - 10:40 am | Coffee break |
10:40 - 11:10 am | Automated Testing at Statistics Canada: The Compuware Solution | Michel-Éric Velleman (Statistics Canada)
11:10 - 11:35 am | Questionnaire Development within ONS | Colin Setchfield (ONS)
11:35 am - Noon | Round Table Discussion: Common Problems During Application Development |

Coordinators: Charlotte Scheper <cscheper@rti.org> and Mike Egan <mike.egan@statcan.ca>.

WA-3. CASIC Methods for Establishment Surveys.

This workshop will look at the application of CASIC methods to establishment surveys and will include presentations on current methods of mass data capture.

Rob Dymowki (Westat) will discuss NAEP E-filing, an online application that allows schools, school districts, and state education departments to upload files of student information. The files are then standardized with input from the user about the contents of the submission: the user maps their variable names and variable definitions to ours, for about 10 variables. During this process the application runs QC checks to help ensure solid data.
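
The mapping-and-QC step described above might be pictured as follows. This is a hypothetical sketch with invented variable names, not the actual NAEP E-filing code:

```python
# Standard variables and their expected types; names are invented
# for illustration, not drawn from the NAEP E-filing application.
STANDARD_VARS = {"stu_grade": int, "stu_gender": str}

def load_row(row, mapping):
    """Rename the submitter's columns per their declared mapping,
    type-checking each value as a simple QC check during the load."""
    out = {}
    for user_name, std_name in mapping.items():
        expected_type = STANDARD_VARS[std_name]
        out[std_name] = expected_type(row[user_name])  # raises on bad data
    return out

# The submitter declares how their column names map to ours.
mapping = {"GradeLevel": "stu_grade", "Sex": "stu_gender"}
print(load_row({"GradeLevel": "8", "Sex": "F"}, mapping))
# {'stu_grade': 8, 'stu_gender': 'F'}
```

The point of the design is that the submitter, who knows the file, supplies the mapping once, and the application then standardizes and checks every row automatically.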

Jim O'Reilly (Westat) and Lon Hofman (Statistics Netherlands) will discuss BASIL (Blaise Application for Self-Interviewing), including the general Blaise approach and its development from Delphi and Blaise elements (language, routing and checking, data storage). BASIL is a Windows application with a flexible layout that is easy to download and install.

Jamie Isaac (RTI International) will describe Strategies for Contacting Participants and Enhancing Data Quality, lessons learned from a large, ongoing Web-based collection of postsecondary institution data.

Coordinators: Daniel Wellwood <daniel.wellwood@census.gov> and Shawn Eaton <shawn.l.eaton@census.gov>.

WA-4. Usability.

We believe it is now generally accepted (at least in theory) that the design of CASIC systems and instruments should be user-centered, that is, focused on system and instrument users and how easily they can perform their required tasks. To that end, many agencies and organizations have developed laboratories, tools, and procedures for evaluating the usability of systems and survey instruments and improving their design.

In this session, we have asked participants from different organizations and agencies to tell us what they have been doing recently to apply usability principles and testing procedures to design and evaluation of surveys and systems. These include evaluations of a Web self-administered survey and a browser-based CAPI survey. Presenters will describe design challenges, procedures and tools used, and usability test findings, and there will be time for open discussion about issues raised.

Presentations:

Coordinators: Sue Ellen Hansen <sehansen@umich.edu> and Kath Straub <kath.straub@gmail.com>.


WEDNESDAY AFTERNOON – March 15, 1:30 – 4:30 pm
Concurrent Sessions

WP-1. The Web in Survey Research - Going Beyond the Simple Web Survey.

As the Web-based survey matures as a mode of data collection, we find that the Web in general is opening new possibilities for supporting and conducting self-administered data collection. Sample management, respondent support, data reporting, and collecting new kinds of data are all areas where the Web is beginning to flourish. "Web survey" no longer simply means the originally conceived scrolling or interactive form interface that presents a questionnaire to a respondent. In this session you will hear about new innovations and applications, new tools and infrastructures, and a variety of Web-related topics, all pushing the Web to the next level in survey research.

Time | Topic | Presenters | Materials
1:30 - 1:35 pm | WELCOME / INTRODUCTION | |
1:35 - 2:00 pm | Distributed CAPI Interviewing with Blaise IS for the Web for the Adolescent Trials Network | Rick Mitchell <rayc1@westat.com> (Westat) | presentation (PDF, 6.6 mb)
2:00 - 2:25 pm | E-Mail Data Collection: Can it Work? | Richard Rosen, Hong Yu, Antonio Gomes (Bureau of Labor Statistics) |
2:25 - 2:50 pm | Integrating Interactive Geographical Maps in Web-based Surveys | Brian Hempton, Scott D. Crawford (Survey Sciences Group, LLC), Robert F. Saltz (Prevention Research Center) | presentation (PDF, 42 kb)
2:50 - 3:05 pm | BREAK | |
3:05 - 3:30 pm | Customization in Internet Surveys: Making Large Surveys Seem Smaller | John M. Kennedy (Indiana University Center for Survey Research) | presentation (PDF, 329 kb)
3:30 - 3:55 pm | NYC Department of Health and Mental Hygiene: Online Panel Management System | Kevin M. Kelly, Steve Young (ORC Macro) | presentation (PDF, 821 kb)
3:55 - 4:30 pm | PANEL DISCUSSION / QUESTIONS | |

Coordinator: Scott Crawford <scott@surveysciences.com>.

WP-2. Mobile Survey Computing, Including Wireless and Pen Based Environments.

In this session, we will address issues regarding the design and implementation of mobile computing devices in survey data collection. Presenters will discuss lessons learned in implementation as well as results of research efforts.

Coordinator: Jean Fox <fox.jean@bls.gov>.

WP-3. Multi-mode CASIC Surveys.

Multi-mode CASIC surveys are increasingly supplanting single-mode surveys in an effort to address single-mode limitations, including declining response rates and technological and legal obstacles. Multiple survey modes offer the respondent a choice of mode, and offer the survey organization differing channels of communication and the ability to play one mode off against another. Yet multi-mode surveys are inherently more complex and challenging to field than their single-mode counterparts.

This session focuses on methodological and technical issues that arise in fielding multimode surveys. Maximizing response rate and reducing (or at least controlling) costs are major themes of this session. Within those themes specific topics may include instrumentation, survey management, survey operations, cost modeling, role of survey personnel, incentives within the context of multiple modes, and data quality and comparability.

Time | Topic | Presenters | Materials
1:30 - 1:35 pm | INTRODUCTION TO SESSION | |
1:35 - 2:00 pm | A Multi-mode CATI-Web Survey Experience with Blaise | Craig Ray <rayc1@westat.com> (Westat) | presentation (PDF, 452 kb)
2:00 - 2:25 pm | Using a Multi-Mode Design to Maintain Response Rates | Duston Pope (Market Strategies, Inc.), Amy Vincus, Sean Hanley (Pacific Institute for Research and Evaluation), Scott Crawford (Survey Sciences Group, LLC) | presentation (PDF, 680 kb)
2:25 - 2:50 pm | Telephone Collection as Part of a Multimode Survey | Mark Pierzchala, Debra Wright (Mathematica Policy Research, Inc.), Claire Wilson (Insight Policy Research), Paul Guerino (Education Statistics Services Institute) | presentation (PDF, 230 kb)
2:50 - 3:00 pm | BREAK | |
3:00 - 3:25 pm | Implementing the Canadian Multimode CATI and CAPI Labour Force Survey | Eric Joyal <eric.joyal@statcan.ca> (Statistics Canada) | presentation (PDF, 8.6 mb)
3:25 - 3:50 pm | Challenges of Designing and Implementing Multimode Instruments | Melissa Cominole <mcominole@rti.org> (Research Triangle Institute) | presentation (PDF, 2.6 mb)
3:50 - 4:15 pm | Planning for a Multimode Data Collection System at Statistics Canada | Chad Mackinnon (Statistics Canada) |
4:15 - 4:30 pm | DISCUSSION AND QUESTIONS | |

Coordinator: Mark Pierzchala <mpierzchala@mathematica-mpr.com>.

WP-4. Capacity Planning (time and staff management and related organizational issues).

For organizations running CATI call centres, capacity planning means ensuring that interviewing capacity will be available when it is needed. Workload fluctuation (number of surveys, sample sizes) is only one element involved in managing and scheduling interviewers. Unionization adds further challenges. Some surveys require interviewers with specialized skills (e.g., the ability to speak other languages) or with certain characteristics (gender, age). Use of "best time to call" rules and controls on when calls are made can limit the number of cases available at a given time of day, thereby affecting the number of interviewers scheduled to work. On the other hand, information from call history files can be used to better plan work schedules and to identify patterns of low productivity.
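
As a hypothetical illustration of that last point, a few lines of analysis over call-history records can rank hours of the day by completion rate, suggesting when to schedule more interviewers. The data and record layout below are invented:

```python
from collections import Counter

# Invented call-history records: (hour of day, call outcome).
calls = [(9, "no_answer"), (9, "complete"), (18, "complete"),
         (18, "complete"), (18, "no_answer"), (10, "no_answer")]

completes_by_hour = Counter(h for h, outcome in calls if outcome == "complete")
attempts_by_hour = Counter(h for h, _ in calls)

# Completion rate per hour; higher-rate hours are better staffing candidates.
rates = {h: completes_by_hour[h] / attempts_by_hour[h] for h in attempts_by_hour}
best_hours = sorted(rates, key=rates.get, reverse=True)
print(best_hours)  # [18, 9, 10]: 6 pm converts best in this toy data
```

In practice the same aggregation would run over millions of call records and feed directly into interviewer scheduling decisions.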

This session will give participants a chance to discuss the challenges they currently face and the information, tools, and strategies that could help them make these decisions.

Coordinator: Lecily Hunter <lecily.hunter@a.statcan.ca>.


THURSDAY MORNING – March 16, 9:00 am -12:00 noon
Concurrent Sessions

TA-1. Security Issues in Federal Data Collection.

This session will feature presenters and discussion centered on current issues of security in the data collection environment, including such topics as voice over IP; firewalls; security in non-centralized research environments; mobile computing; authentication methods; spam, viruses, and trojans; and other related topics.

The session will begin with a broad overview of security issues followed by more focused presentations on specific areas that are of current concern with time allowed for questions and spirited discussion.

Each attendee will receive a free yellow sticky note with room for multiple passwords that you can stick near your computer when you return home.

Presentation Materials:

Coordinator: Bill Connett <bconnett@isr.umich.edu>.

TA-2. New and Emerging Technologies, Including Those Based on XML.

This session will focus on new technologies that offer solutions to persistent survey problems or extend the capabilities of survey systems. Depending on the latest developments and available presenters, possible topics include: collaborative research tools, including portals and other approaches to knowledge management, peer collaboration, document search, and interactive data dissemination; automated data cleaning, editing, and management tools, including current COTS offerings; integrated systems that combine data collection, metadata, and project management; advances in voice computing, including coding and post-processing of voice and video recording, voice-to-text systems, word and phrase lookup in speech recognition, voice browsing, and voice/Web synchronization; and various messaging technologies.

Coordinators: David Uglow <duglow@rti.org> and Jim Kennedy <kennedy.jim@bls.gov>.

TA-3. Accessibility.

In this session, we will take a look at how Section 508 affects computer-assisted surveys. The session will include:

Presentation Materials:

Supplemental Materials:

Coordinators: Jean Fox <fox.jean@bls.gov> and Jim O’Reilly <oreilly@westat.com>.