[lookit-research] Lookit update, spring/summer 2019

Kim Scott kimscott at mit.edu
Wed Sep 4 13:28:52 EDT 2019


Happy Fall, Lookit friends!


Hope you’re enjoying the start of the semester. Here’s an update on what
we’ve been up to in the past six months.

We’ve primarily been focused on platform development, and have a lot of
progress to report since the last update:

   - The “consent manager” tool is live and in use! Researchers can view
     consent videos and mark them as valid/invalid. All permissions to access
     data now take into account the (centrally stored) consent review status;
     researchers can’t accidentally access or use any data before checking for
     a statement of informed consent. In a similar vein, in the rare cases
     where parents choose to withdraw all permission to use video at the end
     of a study, those videos are automatically made unavailable and deleted.
     These are some of the features we prioritized to reduce the potential for
     human error in data handling.

   - Families can see their own videos right away after participating, to
     check that everything worked and see what their kids were up to! And
     researchers can easily leave friendly feedback from the Lookit platform.
     This both helps make participation more rewarding for families and aligns
     with our ideal of respecting families as partners in discovery.

   - Families can now indicate languages their child speaks and some
     conditions and characteristics with checkboxes when they sign up, paving
     the way for research with special populations and eventually hosting
     studies in more languages.

   - Researchers can flexibly describe eligibility criteria for their studies
     using a boolean expression referencing the child’s age, gestational age
     at birth, language background, and other characteristics.

   - Email functionality is much improved -- it’s easier to select the
     appropriate participants, and emails sent via the Lookit platform are
     stored and downloadable by researchers.

   - Rico is currently working on a recruitment dashboard to support
     evaluation of outreach efforts, showing trends over time in how many
     families and kids are accessing Lookit and participating in studies, how
     old kids are, demographics of families, how they heard about Lookit, etc.
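
To give a concrete flavor of the eligibility criteria mentioned above, a
boolean criteria expression for, say, full-term 12- to 24-month-olds who
hear English might look something like this (the variable names here are
illustrative assumptions only -- see the documentation for the exact
vocabulary the platform accepts):

```
speaks_en
  AND (age_in_days >= 365) AND (age_in_days <= 730)
  AND (gestational_age_in_weeks >= 37)
```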


We’ve also been expanding functionality for the individual studies, based
on needs that have come up in beta testing:

   - Study frames now allow *setting parameters based on previous data* from
     the same session and on child characteristics, allowing for conditional
     branching, personalization of stories or instructions, continuing
     training until some criterion is met, etc.
     <https://lookit.readthedocs.io/en/develop/researchers-condition-assignment.html#conditional-logic>

   - Webcam recording can be conducted within individual frames, or
     *session-level recordings* can be made
     <https://lookit.readthedocs.io/en/develop/researchers-create-experiment.html#recording-webcam-video>
     by specifying which frames to start and stop recording on.

   - A child assent form
     <https://lookit.github.io/ember-lookit-frameplayer/classes/Exp-lookit-video-assent.html>
     (which can be shown only for children of a specific age and up, if
     desired) supports a standard assent workflow with multiple segments of
     pictures and text or audio/video explanations.

   - It’s easier to substitute values throughout a study
     <https://lookit.github.io/ember-lookit-frameplayer/classes/Exp-frame-base.html#property_parameters>
     and to make groups of frames
     <https://lookit.readthedocs.io/en/develop/researchers-create-experiment.html#frame-groups>.
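
As a rough sketch of how a few of these pieces fit together in a study's
JSON protocol -- a frame group, parameter substitution, and turning on
session-level recording from a frame -- a fragment might look like the
following. The frame kinds and property names are from the linked docs but
should be treated as an illustration, not a guaranteed schema:

```
{
    "frames": {
        "intro-group": {
            "kind": "group",
            "parameters": {
                "CHILD_NAME": "your child"
            },
            "frameList": [
                {
                    "kind": "exp-lookit-text",
                    "startSessionRecording": true,
                    "blocks": [
                        {"text": "Next, CHILD_NAME will see a short video."}
                    ]
                }
            ]
        }
    },
    "sequence": ["intro-group"]
}
```

Here every occurrence of the parameter name (e.g. CHILD_NAME) in the group's
frames would be replaced with the corresponding value, and recording would
run at the session level until a later frame stops it.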


Our beta testers are continuing to try out and provide feedback on the
platform, and the first few studies have been completed! Here’s the current
status....

   - “Mind and Manners” (Erica Yoon, Mike Frank): complete and included in
     Erica’s CogSci paper <https://psyarxiv.com/r9zf4>

   - “Flurps and Zazzes” (Lisa Chalik, Yarrow Dunham): completed first study,
     collecting another round of data

   - “Baby Euclid” (Molly Dillon, Liz Spelke): completed first study,
     preparing a conceptual replication

   - “Labels and Concepts” (Bria Long, Mike Frank): completed data
     collection, analyzing

   - “Look and Listen” (Halie Olson, Rebecca Saxe): data collection ongoing

   - “Your Baby, the Physicist” (Junyi Chu, Liz Spelke): data collection
     ongoing

   - “Baby Laughter” (Caspar Addyman): data collection ongoing

   - Several more studies are under active preparation to start testing:
     action planning in teens with autism (Pawan Sinha, MIT); neonatal
     imitation at home (Laurie Bayet, AU); and approximate numerosity
     judgments in deaf and hearing-impaired children (Stacee Santos, BC).


Good news and bad news on *funding* (we’re only partway back to the drawing
board).

   - Lookit will likely be included in a DARPA grant to develop AI systems
     that reach specific target developmental milestones (on the basis that
     we should know more about how human children behave if we want AI to
     behave like them!)

   - Our application to the Spencer Foundation was rejected, as were several
     collaborative proposals we were part of (e.g., NSF mid-scale
     infrastructure for online research, and the Caplan Foundation for the
     neonatal imitation study).

   - *We would be happy to hear about ideas for collaborative proposals* from
     folks who would like to run a particular project on Lookit, even ahead
     of the official launch.


Legal and logistical news: Research on Lookit is now approved via researchers’
own IRBs, after they sign an institutional access agreement. Six
institutions have approved the agreement so far, and we haven’t run into
any major issues besides delays at MIT. To ensure we’re all on the same
page about the agreement, there’s now an informal quiz about the Terms of
Use <https://forms.gle/dZSJtyREMBBaTSnP7> to submit along with the signed
agreement. (Feel free to try it out - feedback is welcome!)

MIT’s Quest for Intelligence “Bridge” program is evaluating OpenGaze
<https://perceptual.mpi-inf.mpg.de/opengaze-toolkit-released/> as a
starting point for automated gaze coding of developmental video, using
datasets from Lookit and from Virginia Marchman’s lab. This has been slow
to get started in part because they’re working with undergrad RAs; we’re
interested in what it would take to get someone dedicated to this project.

Also I had a baby, Keoni, who joins her very proud brother and sister.
(That's where your spring update went.)

*Next steps:*

   - I’m working on a tutorial introduction to using Lookit, so that new
     researchers can set aside a known amount of time to work through
     step-by-step exercises and end up ready to put their own studies online.

   - I’ll be at the “Open Developmental Science
     <https://cogdevpreconference.wixsite.com/opendevscience>” preconference
     at CDS to present a workshop on Lookit. Let me know if you want to meet
     up sometime during CDS!

   - In parallel with work on the next features, Rico will be transferring
     hosting over from the Center for Open Science and setting up a security
     audit before launch.

   - We’ll have an undergrad RA working this term on a comprehensive survey
     of recruitment and advertising options, selecting a few avenues to
     explore in depth.

   - We’re on track for launching on schedule (September 2020) or possibly
     sooner -- we’re excited to build momentum and start growing a community
     of users.


*Learn more / get involved:*

   - Information about the current status of the project, our longer-term
     plans, how IRB approval works, etc. is available on the
     "research-resources" Github repo and wiki
     <https://github.com/lookit/research-resources/wiki>.

   - Overall documentation <https://lookit.readthedocs.io/en/develop/> for
     using the platform, and docs for specific experiment frames
     <https://lookit.github.io/ember-lookit-frameplayer/modules/frames.html>

   - Development planning is organized via Github Issues on the various
     Lookit-related repositories <https://github.com/lookit>. Check out
     what’s planned when under “milestones,” add your own feature requests,
     or pick something to work on!


---
Kim Scott
Research scientist | Early Childhood Cognition Lab | MIT
W: www.mit.edu/~kimscott | Participate: https://lookit.mit.edu

