
Apple Details New Features in ResearchKit 2.0

Posted June 14, 2018 at 9:42pm by iClarified
Apple recently detailed some of the new features coming in ResearchKit 2.0 in a blog post on ResearchKit.org.

ResearchKit 2.0 has a whole new look and feel! The UI has been updated across the entire framework to closely reflect the latest iOS style guidelines. Footers are now sticky to the bottom of all views with filled button styles, and the 'cancel' and 'skip' buttons have been relocated under the continue button for easier navigation. Additionally, a new card view enhances the look of forms and surveys. All of the updates are aimed at making the experience more enjoyable and intuitive for ResearchKit app users.

Here are some of the new features in ResearchKit 2.0...

New Features
ResearchKit 2.0 includes several new tasks and features that enable researchers to collect information in innovative ways, leading to more insightful results (a brief Swift sketch of assembling several of these steps follows the list).

● PDF Viewer: A step that enables users to quickly navigate, annotate, search and share PDF documents.
● Speech Recognition: A task that asks participants to describe an image or repeat a block of text, then transcribes their speech into text and allows them to edit the transcript if necessary.
● Speech in Noise: A task spanning speech and hearing health that assesses participants' speech reception thresholds: participants listen to a recording that mixes a phrase with ambient background noise, then repeat the phrase back.
● dBHL Tone Audiometry: A task that uses the Hughson-Westlake method to determine a user's hearing threshold level on the dB HL scale. To support this task, Apple has also open-sourced calibration data for AirPods.
● Environmental SPL Meter: A task that enables developers to record the user's current background noise level during active tasks and to set thresholds that ensure the user is in a suitable environment before completing other tasks.
● Amsler Grid: A task that instructs participants to hold the phone at a set distance from their face and to close one eye at a time. As participants progress through the instructions, a grid is displayed and they mark any areas where they see distortion.

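For developers who want to experiment with these features, the following is a minimal Swift sketch of assembling several of the new steps into a single task. The step class names (ORKPDFViewerStep, ORKEnvironmentSPLMeterStep, ORKdBHLToneAudiometryStep, ORKAmslerGridStep) follow the naming used in the ResearchKit 2.0 source on GitHub, but the property names shown on the SPL meter step and the eye-side values are assumptions and may differ from the final API; check the framework headers before relying on them.

import ResearchKit
import UIKit

// Sketch of composing several ResearchKit 2.0 steps into one ordered task.
// Class names and initializers are based on the ResearchKit 2.0 beta and
// may change before the stable release; treat them as illustrative.

func makeResearchKit2DemoTask(pdfURL: URL) -> ORKOrderedTask {
    // PDF Viewer: lets participants read, annotate, search, and share a document.
    let pdfStep = ORKPDFViewerStep(identifier: "pdfViewer", pdfURL: pdfURL)
    pdfStep.title = "Study Information"

    // Environmental SPL Meter: checks background noise before a hearing task.
    let splStep = ORKEnvironmentSPLMeterStep(identifier: "splMeter")
    splStep.thresholdValue = 45          // assumed property: maximum acceptable level
    splStep.samplingInterval = 1         // assumed property: seconds between samples
    splStep.requiredContiguousSamples = 5 // assumed property: samples under threshold

    // dBHL Tone Audiometry: Hughson-Westlake hearing threshold estimation.
    let audiometryStep = ORKdBHLToneAudiometryStep(identifier: "toneAudiometry")

    // Amsler Grid: one step per eye; the participant marks distorted regions.
    let leftEyeStep = ORKAmslerGridStep(identifier: "amslerLeft")
    leftEyeStep.eyeSide = .left          // assumed enum value
    let rightEyeStep = ORKAmslerGridStep(identifier: "amslerRight")
    rightEyeStep.eyeSide = .right        // assumed enum value

    return ORKOrderedTask(identifier: "researchKit2Demo",
                          steps: [pdfStep, splStep, audiometryStep, leftEyeStep, rightEyeStep])
}

// Presenting the task uses the same ORKTaskViewController flow as earlier releases.
final class DemoViewController: UIViewController, ORKTaskViewControllerDelegate {
    func startTask(pdfURL: URL) {
        let task = makeResearchKit2DemoTask(pdfURL: pdfURL)
        let taskViewController = ORKTaskViewController(task: task, taskRun: nil)
        taskViewController.delegate = self
        present(taskViewController, animated: true)
    }

    func taskViewController(_ taskViewController: ORKTaskViewController,
                            didFinishWith reason: ORKTaskViewControllerFinishReason,
                            error: Error?) {
        // Step results are available in taskViewController.result before dismissal.
        taskViewController.dismiss(animated: true)
    }
}

Because the new steps plug into the existing ORKOrderedTask and ORKTaskViewController flow, apps built on earlier ResearchKit releases should be able to adopt them without changing their task-presentation code.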
Users are invited to share feedback and participate on GitHub ahead of Apple's stable release.
