FEDIDEVS

SciPy 2024

Day 4 (7)

We finally got a @jni shout-out in the lightning talks session (for scanning the claw)


@ratamero YASSSS 🦞 🦀 reminder that the scans are available for all to download on Monash's FigShare instance: bridges.monash.edu/articles/da 😃

SciPy 2023 Lightning Talks Claw

The Quest to 3D-image the Claw from the SciPy 2023 Lightning Talks.

As co-host of the Lightning Talks at the SciPy 2023 conference in Austin, TX, Madicken Munk injected a little chaos into the process: anyone who had presented a lightning talk in any previous year had to roll a giant die. If the die came up 1, the speaker had to deliver the talk with some chaotic new handicap, invented by Madicken on the spot. [1-3]

The handicap often involved this crab- or lobster-claw-shaped silicone oven mitt. In the case of Juan Nunez-Iglesias's talk about Zulip, Juan had to deliver the talk while using the claw as a sock puppet.

The claw became a recurring theme and a sensation at the conference, and Juan even delivered the napari update at the Tools Plenary Session with the claw. At that point P.L. Lim asked on the conference Slack, "Can napari visualize a 3D model of The Claw?" — thereby throwing down the claw-shaped, silicone gauntlet.

At the end of the conference, Madicken gifted the Claw to Juan, and Juan promised he would endeavour to get it imaged.

(First attempt: Juan asked for help on Mastodon [4], and Lachie Whitehead, aka DrLachie, from the Walter and Eliza Hall Institute (WEHI) in Parkville, Australia offered the WEHI micro-CT. Unfortunately, at 110mm in height, the Claw was too big for the micro-CT. The search continued.)

Juan contacted Olga Panagiotopoulou, a researcher at the Department of Anatomy and Developmental Biology at Monash University who had previously shown him very cool 3D scans of skulls, jaws, and feet, both human and otherwise. She referred him to Michael de Veer at Monash Biomedical Imaging, who agreed to participate in the penultimate step of this most excellent Quest.

And so it came to be that the Claw was imaged at Monash Biomedical Imaging, then converted from a list of 2D DICOM files to OME-NGFF (aka OME-ZARR) files. This record has the original DICOM files, and two OME-NGFF files with different chunking — one set optimised for 2D slice viewing, and another as a single chunk for 3D viewing.

Suggested napari viewing parameters:
- 3D canvas
- attenuated_mip rendering
- attenuation: 0.015

References:
[1] https://twitter.com/SciPyConf/status/1679619665788534784
[2] https://bird.makeup/users/scipyconf/statuses/1679619665788534784
[3] https://fosstodon.org/@jni/110731142487885502
[4] https://fosstodon.org/@jni/110731156853442832
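For anyone who downloads the scans, here's a minimal sketch (not from the record itself) of applying those suggested viewing parameters in napari. It assumes the napari-ome-zarr plugin is installed, and "claw_3d.ome.zarr" is just a placeholder name for the single-chunk 3D file:

```python
# Minimal sketch: open the Claw OME-Zarr in napari with the suggested
# viewing parameters. Assumes the napari-ome-zarr plugin is installed;
# "claw_3d.ome.zarr" is a placeholder for the downloaded single-chunk file.
import napari

viewer = napari.Viewer(ndisplay=3)  # 3D canvas

# viewer.open returns the list of layers it created.
layers = viewer.open("claw_3d.ome.zarr", plugin="napari-ome-zarr")
layer = layers[0]

# Suggested rendering parameters from the record.
layer.rendering = "attenuated_mip"
layer.attenuation = 0.015

napari.run()
```

Attenuated MIP dims contributions from voxels behind bright structures, which tends to give the volume a more solid, depth-cued look than a plain maximum-intensity projection.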

📢 Anita Sarma, our second keynote speaker of the day, shared her experience with mentoring strategies for an inclusive community during our diversity luncheon 🚀

Thank you for these great insights on mentoring in open source 🫶🏻


DrivenData data scientist Emily Dorne is speaking today at SciPy 2024, discussing cutting-edge developments in satellite imagery. Join her at 3pm PST!

Really cool work on solving local problems with satellite imagery and lightweight, effective models.

buff.ly/4bAn9RS


cfp.scipy.org/2024/talk/LMF8QH If anyone's at SciPy, go check out my colleague Emily Dorne's talk today on detecting harmful algal blooms from satellite imagery! Emily took the winning solutions from DrivenData's competition drivendata.org/competitions/14, talked to tons of potential users in the natural resource monitoring space, and built a rock-solid package for anyone to use. An amazing example of how to make impactful open-source tools.

Using Satellite Imagery to Identify Harmful Algal Blooms and Protect Public Health SciPy 2024

**Motivation**

Inland water bodies provide a variety of critical services for both human and aquatic life, including drinking water, recreational and economic opportunities, and marine habitats. A significant challenge water quality managers face is the formation of harmful algal blooms (HABs). One of the major types of HABs is cyanobacteria. HABs produce toxins that are poisonous to humans and their pets, and threaten marine ecosystems by blocking sunlight and oxygen. While there are established methods for using satellite imagery to detect cyanobacteria in larger water bodies like oceans, detection in small inland lakes and reservoirs remains a challenge. Machine learning is particularly well-suited to this task because indicators of cyanobacteria are visible from free, routinely collected data sources. Whereas manual water sampling is time and resource intensive, machine learning models can generate estimates in seconds. This allows water managers to prioritize where water sampling will be most beneficial, and can provide a bird's-eye view of water conditions across the state.

**Methods**

[CyFi](https://cyfi.drivendata.org/) (Cyanobacteria Finder) is an open-source Python [package](https://github.com/drivendataorg/cyfi) that uses satellite imagery and machine learning to detect cyanobacteria levels, one type of HAB. CyFi helps decision makers protect the public by flagging the highest-risk areas in lakes, reservoirs, and rivers quickly and easily. CyFi was born out of the [Tick Tick Bloom](https://www.drivendata.org/competitions/143/tick-tick-bloom/) machine learning competition, hosted by DrivenData. The model in CyFi is based on the winning solutions, and has been optimized for generalizability and efficiency. CyFi uses two main [data sources](https://cyfi.drivendata.org/#data-sources): Sentinel-2 satellite imagery and a land cover gridded map. Cyanobacteria estimates are generated by a LightGBM model, a gradient-boosted decision tree algorithm. The model was trained and evaluated using nearly 13,000 "in situ" labels collected manually by [organizations across the U.S.](https://www.drivendata.org/competitions/143/tick-tick-bloom/page/651/#about-the-project-team). To build intuition around model predictions and error cases, CyFi comes with a [visualization tool](https://cyfi.drivendata.org/explorer/), which lets users view the base satellite imagery tile corresponding to each sample point prediction.

**Results**

CyFi uses high-resolution Sentinel-2 satellite imagery (10-30m) to focus on smaller water bodies with rapidly changing blooms. Sentinel-3 is used by most existing tools, but its resolution of 300-500m is often too coarse for small, inland water bodies. We find that CyFi performs at least as well as Sentinel-3 based tools and has 10 times more coverage of lakes across the U.S. CyFi is most accurate at low and high cyanobacteria densities and is intended to plug into human-in-the-loop workflows. Where blooms are likely absent, water quality managers can better allocate ground sampling resources by deprioritizing these water bodies. Where severe blooms are likely present, water quality managers can flag these for public health interventions, such as swimming or drinking water advisories.

**Previous speaking experience**

Emily Dorne is a lead data scientist at [DrivenData](https://drivendata.co/) and is the technical lead for CyFi. She has previously given talks at the [Women in Data Science (WiDS) Global](https://www.youtube.com/watch?v=TIzZD-XJeSo) conference, WiDS Puget Sound, and [CamTrapAI](https://camtrapai.github.io/indexold.html). In addition, she has led in-person data ethics workshops using the open-source Python package [Deon](https://deon.drivendata.org/), of which she is an author.
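To make the Methods description concrete, here is a small conceptual sketch of that kind of pipeline. It is not CyFi's actual code: the feature columns, the log transform, and the random data are assumptions made only to keep the example self-contained and runnable.

```python
# Conceptual sketch of the pipeline the abstract describes: tabular features
# derived from Sentinel-2 imagery plus a land cover value, fed to a LightGBM
# regressor that estimates cyanobacteria density at sampled points.
# All column names and data here are made up; this is NOT CyFi's code.
import lightgbm as lgb
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200  # pretend in-situ sample points

train = pd.DataFrame({
    "b02_mean": rng.random(n),             # Sentinel-2 blue band statistic
    "b03_mean": rng.random(n),             # green
    "b04_mean": rng.random(n),             # red
    "land_cover": rng.integers(0, 10, n),  # land cover class near the point
    "density": rng.lognormal(10, 2, n),    # cyanobacteria cells/mL (label)
})
features = ["b02_mean", "b03_mean", "b04_mean", "land_cover"]

# Gradient-boosted trees on log-density (the log transform is an assumption
# here, a common choice for heavily skewed targets, not a documented CyFi
# detail).
model = lgb.LGBMRegressor(n_estimators=200)
model.fit(train[features], np.log1p(train["density"]))

# Estimated cells/mL for the first few points.
print(np.expm1(model.predict(train[features].head())))
```

In practice you would install and run CyFi itself, which handles the Sentinel-2 feature extraction and ships a trained model, rather than training anything from scratch.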

Found not only someone who's also responsible for an OMERO deployment, but who grew up in the tiny English town I lived in for 6 years. SciPy is a small world.


Amazing posters at the poster session 🥳
