Smart Great Lakes Initiative Virtual Workshop Q & A


The following questions surfaced during the Smart Great Lakes Initiative Virtual Workshop held on April 21, 2020.

 

Q: Speaking of GLRI, is there any active coordination between the Smart Great Lakes Initiative and Healing Our Waters Coalition, and/or Great Lakes St. Lawrence Cities Initiative? 

Kelli Paige: Yes, before we released our Strategic Plan in October 2019 and officially “launched” the Smart Great Lakes Initiative, GLOS reached out to several regional organizations to introduce them to the concept and get their thoughts and feedback. A list of early supporters of the effort can be found here.

 

Q: Does making the Great Lakes “smart” mean you’re looking for solutions that can only be applied within the physical lakes themselves, or does it encompass nearby cities, their entire watersheds, etc.?

Kelli Paige: Making the lakes “smart” definitely means going beyond the coastlines of the lakes, farther up into the watershed. The Great Lakes can be considered “smart” when we are using the best technology available to make data- and science-driven decisions that help inform our management and use of the lakes. That includes any data or information related to coasts, uplands, watersheds, cities, and land, water, and air generally throughout the region.

Beyond environmental data, Smart Great Lakes can also include information related to our region’s social, economic, and civil infrastructure as important context for understanding and managing major environmental issues. So although traditionally GLOS has focused on managing in-lake environmental data, from our perspective, we want Smart Great Lakes to address the broad spectrum of data and information needs in the region and we are adapting our IT platform to help serve as a resource for more complex and multi-disciplinary data management and analysis.

 

Q: With the data you are getting how do you plan to change people’s lives? For example, what is the main benefit to users, local populations/businesses?

Tim Kearns: We hope the presentations made clear the impact that emerging and innovative “smart” technologies could have on people’s lives. Perhaps the primary and greatest benefit is a better understanding of the current and trending conditions of the Great Lakes. This ranges from human health (drinking water) to economic benefits (fishing industry) and recreation (beach access/safety for swimming, boating safety) to more accurate modeling of scientific data (research and weather prediction).

 

Q: Could Seagull be integrated with a maritime transport logistics system?

Mark Fisher: Yes, the intention is that we would connect different data streams into Seagull to create larger data lakes.

Tim Kearns: Yes, Seagull could potentially be integrated with a maritime transport logistics system. For instance, real-time conditions could be piped into another system via our Seagull API (Application Programming Interface). We would need to better understand the context and application in order to assess the technical needs and potential for integration.
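
To make the idea concrete, here is a minimal Python sketch of what such an integration could look like. The endpoint URL, query parameters, response fields, and station identifier are illustrative assumptions, not the published Seagull API.

```python
# Minimal sketch: pulling recent observations from a hypothetical observations
# endpoint and handing them to a downstream logistics system. The URL, query
# parameters, and response fields are illustrative assumptions only.
import requests

def fetch_latest_conditions(station_id: str) -> dict:
    """Request the most recent observations for one station (hypothetical endpoint)."""
    resp = requests.get(
        "https://seagull.example.org/v1/observations",  # placeholder URL, not the real API
        params={"station": station_id, "limit": 1},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    latest = fetch_latest_conditions("45161")  # hypothetical buoy identifier
    print(latest)  # a logistics system would map these fields into its own schema
```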

 

Q: What is your approach for data standardization?

Tim Kearns: Data standardization is a great topic. The new platform will be able to handle a wide variety of data, data types, and data structures. These range from flat files (ASCII, binary, proprietary) to streaming data (IoT, real-time, near real-time) and content accessed from other platforms using APIs (Application Programming Interfaces). Once the content makes it into the GLOS platform, we plan to “normalize” that data as much as possible so it is easy for others to download, analyze further, visualize, and publish as information services. That process will use ETLs (Extract, Transform, Load) where possible to convert and normalize the data and make it available. These types of processes are widely used in the GIS and spatial domains.
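
As a rough illustration of that kind of ETL normalization, the Python sketch below renames provider columns and converts units into a consistent form. The file names, column names, and unit conversions are hypothetical, not a description of GLOS’s actual pipeline.

```python
# Minimal ETL sketch, assuming a hypothetical buoy CSV with ad-hoc column names.
# Extract a flat file, transform it to consistent names and SI units, and load
# the normalized table for downstream analysis and visualization.
import pandas as pd

COLUMN_MAP = {"WTMP_F": "water_temp_c", "WSPD_KT": "wind_speed_ms"}  # assumed source columns

def etl(source_csv: str, target_csv: str) -> None:
    # Extract: read the provider's flat file as-is
    df = pd.read_csv(source_csv)

    # Transform: rename columns and convert units
    df = df.rename(columns=COLUMN_MAP)
    df["water_temp_c"] = (df["water_temp_c"] - 32) * 5 / 9   # Fahrenheit -> Celsius
    df["wind_speed_ms"] = df["wind_speed_ms"] * 0.514444      # knots -> m/s

    # Load: write the normalized table
    df.to_csv(target_csv, index=False)

if __name__ == "__main__":
    etl("provider_buoy.csv", "normalized_buoy.csv")
```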

 

Q: Will you have the partner who is providing your data management platform apply ETLs, will you ask data providers to do that, or both?

Tim Kearns: There is no intention to ask data providers to do the ETL (unless they can and it is a natural part of their workflow). The plan at the moment is to have a flexible data ingest process on the platform that handles a variety of inputs and either stores the data in its native format or converts it, to NetCDF for example, or to a data stream, depending on the period of ingest.
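
For example, an ingest step that converts a flat file into NetCDF might look roughly like the sketch below, which assumes a hypothetical CSV feed with a time column and a water temperature variable; it is not the platform’s actual ingest code.

```python
# Minimal ingest sketch: convert a hypothetical timestamped CSV into a
# self-describing NetCDF file using xarray.
import pandas as pd
import xarray as xr

def csv_to_netcdf(source_csv: str, target_nc: str) -> None:
    # Read the provider's file and index it by time
    df = pd.read_csv(source_csv, parse_dates=["time"]).set_index("time")

    # Convert the table to an xarray Dataset and attach minimal metadata
    ds = xr.Dataset.from_dataframe(df)
    ds["water_temp_c"].attrs["units"] = "degC"  # assumed variable name

    # Write NetCDF for download or further analysis
    ds.to_netcdf(target_nc)

if __name__ == "__main__":
    csv_to_netcdf("station_feed.csv", "station_feed.nc")
```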

 

Q: What’s the relative level of interest and investment in open-water (aka blue water) vs. coastal vs. watershed data?

Tim Kearns: SGLi is interested in data all the way from the open water high into the watershed—including the social, economic, and infrastructure information that can provide context for understanding and managing environmental challenges. 

 

Q: I was wondering when you might be looking to use [H2NOW Chicago technologies] for lake monitoring. When we originally made the TECTA-PDS intro to MWRD and the City of Chicago, we discussed using this technology for monitoring and reporting water quality for Chicago lakeshore activities and swimming.

Alaina Harkness: The H2NOW Chicago technologies are being tested in an urban river setting. Current is interested in evaluating the potential of these and other technologies for monitoring lake water quality, once we have completed the pilot on the river. A number of different stakeholders have expressed interest in expanding water quality monitoring to the lake. Thanks for the question! 

 

Q: What sensors are you using for phosphorus testing? Are you aware of any optical real-time phosphorus sensors or any promising R&D projects for this?

Bryan Stubbs: There is promising DNA-on-a-chip technology, but it is still too expensive. We think remote sensing and cheaper proximity sensors, backed up with analytics, will get us there. Xylem is working on a pilot on this topic.

 

Comment: It is very important to maintain a good network of high-quality, quality-controlled data (like high-quality sensor buoys). AoT has shown that not all of the cheaper sensors work very well for long-term deployment. We currently don’t know which data will and will not be reliable, so the new IoT sensors & data should be viewed as uncertain until they’re ground-truthed against reliable data.

Tim Kearns: We are certainly not suggesting that certified, larger equipment disappear. We hope that having more devices will augment the network.

That’s definitely the best approach. I just wanted to be sure that everyone understands that IoT sensors are great sources of supplementary data, especially to increase areal coverage, but they shouldn’t be the only sources of data.
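
One simple way to ground-truth a low-cost IoT sensor against a quality-controlled reference, as suggested above, is to align the two records in time and compute basic agreement statistics. The sketch below assumes hypothetical CSV files and column names; it is an illustration, not an SGLi QC procedure.

```python
# Minimal ground-truthing sketch: compare a low-cost IoT sensor against a
# co-located, quality-controlled reference buoy by computing bias and RMSE.
# File names and column names are assumptions for illustration only.
import numpy as np
import pandas as pd

def compare_to_reference(iot_csv: str, ref_csv: str) -> dict:
    # Load both records and align them on timestamp
    iot = pd.read_csv(iot_csv, parse_dates=["time"]).set_index("time")
    ref = pd.read_csv(ref_csv, parse_dates=["time"]).set_index("time")
    joined = iot.join(ref, lsuffix="_iot", rsuffix="_ref").dropna()

    # Simple agreement statistics over the shared period
    diff = joined["water_temp_c_iot"] - joined["water_temp_c_ref"]
    return {"bias": float(diff.mean()), "rmse": float(np.sqrt((diff ** 2).mean()))}

if __name__ == "__main__":
    print(compare_to_reference("iot_sensor.csv", "reference_buoy.csv"))
```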

 

Have more questions or want to get connected?

Email Katie Rousseau, Smart Great Lakes Liaison, at katie@glos.org.