10 Questions: ChatGPT for New York State GIS

I thought I was going to get out in front of all the ChatGPT for GIS discussion for the New York State GIS community, but I was off by an order of magnitude.  YouTube is already replete with ChatGPT for GIS tutorials, and if we wait until next week there will probably be another ChatGPT alternative.  (I found the Matt Forrest YouTube video a good ChatGPT for GIS intro – I actually formed a question after one of his examples.)  Even Amazon jumped into the fray recently with their Amazon Bedrock option.  Take your pick: Bing Chat, Chatsonic, YouChat, Google Bard (available via a waiting list), and a whole host of others on the market in 2023. 

No, the robots aren’t taking over and I suspect we won’t see any of the early Terminator Cyberdyne Systems androids applying for your local GIS Analyst position anytime soon – though it might be a solution for those long dreadful digitizing jobs in the cart lab.  One way or another, Chatbot AI technology is definitely here to stay, evolve, and become another piece of the geospatial toolbox.  

Nothing specific or implied in any of the questions.  Just random chains of thought, though it does take a little getting used to putting the questions together vs. a basic Google search.  And it does seem to help to refresh and ask the same question twice – or even three times – to get more detailed results.  I never was a programmer, so I'm a little short on questions in that space.

A lot here – actually 17 questions in all.  Straight copy and paste.  Skim through as you want and find a topic/question you might be interested in.  There are tons more which could have been included.

Without any further ado, my first machine interview: 

eSpatiallyNewYork:  What freshwater fish datasets are best for mapping fish population based on water depth in the Great Lakes?

ChatGPT:  There are several freshwater fish datasets that can be useful for mapping fish populations based on water depth in the Great Lakes. Here are a few options:

  1. Great Lakes Acoustic Telemetry Observation System (GLATOS) – This dataset provides information on the movements and behaviors of tagged fish in the Great Lakes.  It includes data on depth, temperature, and location, which can be used to create maps of fish populations based on water depth.
  2. Great Lakes Fishery Commission (GLFC) – The GLFC maintains a database of fish populations in the Great Lakes.  This dataset includes information on fish species, size, and location, which can be used to map fish populations based on water depth.
  3. Great Lakes Environmental Database (GLENDA) – GLENDA is a comprehensive database of environmental data for the Great Lakes region.  It includes information on water depth, temperature, and environmental factors that can be used to map fish populations.
  4. USGS Great Lakes Science Center – The USGS Great Lakes Science Center conducts research on fish populations in the Great Lakes and maintains a database of fish surveys.  This dataset includes information on fish species, abundance, and location, which can be used to map fish populations based on water depth.

Ultimately, the best freshwater fish dataset for mapping fish populations based on water depth in the Great Lakes will depend on the specific research question and the type of analysis being conducted.

eSpatiallyNewYork:  To what degree of accuracy can I map features from New York State orthophotography?

ChatGPT:  The degree of accuracy with which you can map features from New York State orthophotography depends on various factors such as the resolution of the imagery, the accuracy of the ground control points used for georeferencing, and the skill and expertise of the analyst.

New York State orthophotography typically has a resolution ranging from 0.5 to 1 meter, which means that individual features such as buildings, roads, and other infrastructure can be identified and mapped with a reasonable level of accuracy. However, mapping smaller features such as individual trees, shrubs, or small structures may be more challenging, as these features may not be visible at this resolution.

The accuracy of mapping also depends on the quality of the georeferencing. The accuracy of the ground control points used for georeferencing should be of a similar or higher resolution than the orthophotography itself. This ensures that the imagery is correctly aligned with the ground, allowing for accurate measurements and mapping.

The skill and expertise of the analyst are also critical factors in the accuracy of mapping features from orthophotography. A trained analyst with experience in image interpretation and feature extraction techniques can accurately identify and map features with a high degree of accuracy.

Overall, mapping features from New York State orthophotography can provide a reasonable degree of accuracy for larger features such as buildings, roads, and other infrastructure. However, mapping smaller features may require higher resolution imagery or other sources of data. It is always important to validate and verify the accuracy of the mapping results through ground truthing and other quality control measures.


Streamlining the New York City Environmental Quality (CEQR) Review Application with Geospatial Tools

Open source content and tools at the core of automating complex process

City Environmental Quality Review, or CEQR, is the process by which New York City agencies determine what effect, if any, a discretionary action they approve may have upon the environment. CEQR is a disclosure process and not an approval process in and of itself. Completion of an environmental review supports other decisions made by agencies such as approvals of rezoning or variance applications, funding, or issuance of discretionary permits. Ministerial actions, such as the issuance of a building permit, are not subject to environmental review.

Historically, CEQR, along with other government environmental review programs such as the New York State Environmental Quality Review Act (SEQRA) and the National Environmental Policy Act (NEPA), has been the subject of much debate – right or wrong – with regard to being overwhelming, complicated, and costly for the individuals and/or organizations involved in projects or “actions” which trigger the application process.

CEQR is a precursor to ULURP (the Uniform Land Use Review Procedure), which, in part, is the approval process that decides the fate of the action.  ULURP cannot start until the environmental review process is complete.

Introducing AutoCEQR

In the New York CEQR space, leave it to a couple of seasoned GIS folks to step in and combine professional experience with geospatial tools and programming skills to offer a cost-effective and streamlined way to work through the CEQR application.

AutoCEQR cofounder Matt Sloane has worked in the planning field since 2007, working extensively with SEQRA and CEQR.  Over that time Matt developed specialties in both GIS and data science.  As Matt learned to program the tools that power Esri ArcGIS Desktop software, he realized that many of the processes required by CEQR, which are explicitly prescribed by the CEQR Technical Manual, could be automated based on existing data (e.g., MapPLUTO) and several project-specific inputs. He approached Danny Sheehan, a close friend and former classmate from SUNY Geneseo’s planning and geography courses, about the project. Both agreed it would be a great opportunity to put their combined skills to work and build a platform to augment the CEQR application process.  Danny was able to bring the geospatial development expertise and software production knowledge he gained at UBS, Carto, and Columbia University to start and evolve the project into a production application.

AutoCEQR leverages a mixture of City, State, and Federal data resources, though primarily relies on NYC Open Data.  Other data sources include:

This 400’ radius buffer around a subject property requiring CEQR shows the land use classifications of adjacent parcels, which are included in the regularly updated NYC MapPLUTO file

A. Coding and Software Environments

Python is at the core of the AutoCEQR technology.  For working with data, the AutoCEQR team uses Pandas, GeoPandas, Shapely, and Fiona, plus ArcPy for generating Map Document files (.mxd’s), and creates custom Python classes for the workloads.  Sheehan notes, “With GeoPandas and Shapely it’s phenomenal how close to parity they now are for matching ArcPy functionality.”  In the development environment, PyCharm Community Edition and GitHub are used for code development and versioning.
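Sheehan's point about ArcPy parity can be illustrated with a small sketch of the kind of buffer screening CEQR prescribes, done with Shapely alone. This is a hypothetical example, not AutoCEQR code: the parcel IDs and coordinates are made up, and it assumes a projected coordinate system with units in feet (e.g., New York State Plane).

```python
# Minimal sketch: a 400-foot buffer/intersection check of the kind CEQR
# prescribes, done with Shapely instead of ArcPy. Coordinates are made up
# and assume a projected CRS with units in feet (e.g., NY State Plane).
from shapely.geometry import Point, Polygon

def parcels_within_radius(subject, parcels, radius_ft=400):
    """Return the parcel IDs whose geometry intersects the study buffer."""
    study_area = subject.buffer(radius_ft)
    return [pid for pid, geom in parcels.items() if geom.intersects(study_area)]

# Hypothetical subject property centroid and two neighboring parcels.
subject = Point(0, 0)
parcels = {
    "1000470001": Polygon([(350, 0), (450, 0), (450, 100), (350, 100)]),          # straddles the buffer
    "1000470002": Polygon([(900, 900), (1000, 900), (1000, 1000), (900, 1000)]),  # well outside
}
print(parcels_within_radius(subject, parcels))  # → ['1000470001']
```

The same two calls (`buffer` and `intersects`) are what an ArcPy Buffer/Select By Location pair would do, which is what makes swapping the geoprocessing engine feasible.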

AutoCEQR prototyping started with ArcPy for all tasks, but the team decided to abstract the high-level functions so the geoprocessing engine could be changed to GeoPandas, their geoprocessing library of choice.  For interacting and communicating with Amazon Web Services (AWS) – the current AutoCEQR cloud computing platform – the developers leveraged Boto3 (the AWS SDK for Python).  EC2 and S3 are leveraged in the AWS environment for computing, data storage, and distribution, which has enabled the team to keep the monthly application computing bill fairly low. In the future, they anticipate modifying the architecture to leverage more serverless technology and more scalable design for added compute cost savings.  AWS generously provided AutoCEQR with free computing credits for one year through AWS Activate – which was brought to their attention through their involvement at the Columbia Startup Lab (CSL).  QGIS is also used to verify results and for quick GIS work. 

Interacting with Census data and a whole host of services is made possible by leveraging the many great open-source libraries available on PyPI and GitHub. The storefront is the Squarespace API, which is used to process and deliver orders.

AutoCEQR still uses ArcPy mapping for generating maps, .mxd’s, and map packages, but given the high cost of licensing and the technical slowdown it adds to both the production application and ongoing development speed, it’s unclear if .mxd’s will exist in future iterations. (Both Sheehan and Sloane would like more feedback from users on whether the .mxd deliverable is necessary, or whether the application should generate static maps with Matplotlib and GeoPandas, or whether interactive web maps would be more helpful.)

The data engineering ETL process mostly consists of pulling down data with requests, unzipping files, some transformation and projection of data, plus API libraries and a scheduler. The team downloads the latest data every night – whether the source is updated daily or not. Redesigning the data ETL would be a big focus for improving the platform and saving on cloud storage and computing costs.
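A nightly pull like the one described might be sketched as follows using only the Python standard library. The dataset URLs and names are hypothetical, and AutoCEQR's actual pipeline (which also reprojects data and runs on a scheduler) is not public:

```python
# Sketch of the nightly pull-unzip-stage step described above, using only
# the standard library. URLs and paths are hypothetical placeholders.
import io
import pathlib
import urllib.request
import zipfile

def fetch_and_unzip(url: str, dest: pathlib.Path) -> list[str]:
    """Download a zipped dataset and extract it, returning member names."""
    with urllib.request.urlopen(url) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    dest.mkdir(parents=True, exist_ok=True)
    archive.extractall(dest)
    return archive.namelist()

def nightly_targets(sources: dict[str, str]) -> list[tuple[str, str]]:
    """Every source is refreshed every night, updated upstream or not."""
    return sorted(sources.items())

sources = {
    "mappluto": "https://example.org/mappluto.zip",       # hypothetical URL
    "zoning": "https://example.org/zoning_features.zip",  # hypothetical URL
}
for name, url in nightly_targets(sources):
    print(name, url)
```

Downloading everything nightly regardless of upstream update frequency is the simple-but-wasteful pattern the team says they would redesign to cut storage and compute costs.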

In addition to being checked for consistency with existing property zoning classifications, projects are also reviewed in the context of their proximity to a myriad of other special districts and overlay zones.

B.  Application Process

Users input relevant project-specific information (e.g., dwelling units, building height, square footage, etc.) via the AutoCEQR website.  From there the application software ingests the data and checks it against public data sources – usually with some intermediate geoprocessing steps required – and then references the analysis thresholds stated in the Environmental Assessment Statement (EAS) to determine which analyses the proposed project is required to undertake as part of the CEQR environmental review. For certain quantitative calculations, AutoCEQR has translated all of that logic into functions or classes in the codebase. Users also receive the data and maps for either a CEQR Pre-Screen or a select set of CEQR Full Analysis items. This Vimeo video provides an introduction to accessing the application and illustrates the products generated.
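The threshold-screening step described above might be translated into code along these lines. The analysis names and threshold values here are purely illustrative assumptions, not the actual values from the CEQR Technical Manual or AutoCEQR's codebase:

```python
# Hypothetical sketch of EAS threshold screening: project inputs are
# checked against per-analysis thresholds to decide which CEQR analyses
# are triggered. Threshold numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class ProjectInputs:
    dwelling_units: int
    gross_sq_ft: int

# Illustrative screening thresholds: analysis name -> predicate on inputs.
THRESHOLDS = {
    "transportation": lambda p: p.dwelling_units > 200,
    "air_quality": lambda p: p.gross_sq_ft > 100_000,
}

def required_analyses(project: ProjectInputs) -> list[str]:
    """Return the analyses whose screening threshold the project trips."""
    return [name for name, trips in THRESHOLDS.items() if trips(project)]

print(required_analyses(ProjectInputs(dwelling_units=250, gross_sq_ft=80_000)))
# → ['transportation']
```

Encoding each threshold as a small predicate is one way the manual's prescriptive rules could be kept auditable in code.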

C.  Usage

To date, several dozen environmental professionals, targeted from a few key firms, have evaluated the application and gone on to use AutoCEQR in production. Currently Sheehan and Sloane are allowing users to leverage AutoCEQR for free in order to get helpful product feedback and gain traction, with the aim of soliciting feedback for refinement, feature expansion, and product evolution.  AutoCEQR has been well received by the former director of the NYCDCP Environmental Assessment Review Division, Ms. Olga Abinader.  She comments:

“AutoCEQR is an excellent application – as its title indicates, it automates tedious, time-consuming CEQR documentation that has historically taken consultants dozens of person-hours to complete.  As a longtime NYC environmental review expert and former public service leader, I appreciate that it gathers data points from the City’s publicly available databases and agency websites (MapPLUTO, NYC Parks, NYC LPC, GIS sources), and combines this information with user inputs (i.e., analysis framework details) to generate useful EAS Maps, visuals, and content/data for the EAS Forms in a short turnaround. Given the time savings it offers, I am very enthusiastic about AutoCEQR as a tool and recommend it highly to consultants, public service professionals, the general public, decision-makers and others interested in preparing or reviewing CEQR materials.” 


All AutoCEQR maps are included in the project delivery file as both ArcGIS Map Document files (.mxd) and Map Package files (.mpk).

D.  Affordable Housing Development Services Discount

Those working on the development of Affordable Housing or Inclusionary Housing are encouraged to contact the AutoCEQR team.  Their aim is to provide the AutoCEQR platform and reporting at a deep discount for individuals or companies involved in these types of housing projects.  If the entire development provides 100% Affordable units, the AutoCEQR team intends to provide free reporting and analysis.*

As the product is currently operating under a freemium model, users don’t need to currently apply the discount.  However, it is important for AutoCEQR to continue this offering to support affordable housing in NYC in the event AutoCEQR ever moves to any kind of fee-based model. 

* Free reporting with minimal overhead for costs associated with report processing. 

Summary 

Development and marketing efforts on the AutoCEQR project have slowed since both Sheehan and Sloane started new full-time positions.  Nonetheless, both continue to explore interesting options for its future development and continued success.  Individuals and companies interested in the application and/or in communicating with Sheehan and Sloane are encouraged to do so via the contact information below.

Contact:

Daniel M. Sheehan
danny@autoceqr.com

Matt Sloane
matt@autoceqr.com

Empire State GIS/Mapping DIYer Phenom: Andy Arthur

Self-taught hobbyist has a treasure chest of geospatial content on website

One of the benefits of writing about all-things geospatial in the Empire State is that sometimes I just don’t know what I’ll come across.  Looking for this thing and finding that.  Starting in earnest on an article about a certain GIS channel, then a couple of days later finding I’ve completely jumped the rails and am way over there writing about Channel Z.  (Yup, that static in the attic.)  Or ending up on a cool or fun website not really knowing how I got there.

Case in point:  Interactive Maps by Andy Arthur.  Empire State mapping DIYer extraordinaire.  Just a hobby.

When I first stumbled onto the site and had spent some time driving around, I realized I needed to find out who was behind it all.  It definitely isn’t the look and feel of the traditional geospatial website I normally include or reference in my blog, but there was enough interesting – and yes, quite different – content to dig a little deeper.  And I’m glad I did.  This is not a blog post that focuses on a particular topic or concept, but rather more of a pointer to the URL, letting you take away from the website what you want.  

It turns out the person behind all of this is Andy Arthur, who by day is Deputy Director of Research Services in the NYS Assembly.  “I have no formal GIS training, as things were still pretty primitive back when I was in college (SUNY Plattsburgh) in the early 2000s, especially when it came to web services, online data and open source software,” says Arthur. “Computers were a lot less powerful back then. I remember vaguely hearing a bit about remote sensing when I was involved in the Environmental Science Club in college, but it wasn’t something I ever used.”

Since then, working on his own, Arthur picked up QGIS (and the accompanying PyQGIS developer tools) while looking for a way to make his own topographic maps, because he wasn’t happy with what was available on the Internet. He later found out he could FOIL a primitive campsite shapefile from NYS DEC and use that data to help find campsites. “I was pretty good at map and compass stuff from my years in Boy Scouts and always interested in environmental and land use issues,” he says.  Over time, he branched out into other geospatial areas, including web services.  More recently he’s been focusing on more automation of processes, using Python and the R statistical language to do some map plotting and a lot of Census data gathering and processing. “I like working with R as it is fast and easy to implement code in. I’ve also lately been doing a lot more with Leaflet and web services.”  Along the way he continues to use GeoPandas and Leaflet for map making. (BTW, as I was putting this blog piece together I found out that Leaflet was created 11 years ago by Volodymyr Agafonkin, a Ukrainian citizen who at the time was living in Kyiv.)  Content on the site is also made available in KMZ for use in Google Earth.

This is an example of how Arthur processed LIDAR data covering the Rome Sand Dunes west of the City of Rome in Oneida County. The landscape is a mosaic of sand dunes rising about 50 feet above low peat bogs which lie between the dunes. Processed LIDAR data renders the dunes very clearly. Arthur created this originally by writing a QGIS plugin that queries a shapefile with the LIDAR Digital Terrain Model Bare-Earth index, then downloads the GeoTIFFs, and finally joins them together to create the hillshade. The plugin itself is in Python and runs in QGIS, while the LIDAR download/processing script is a php-cli shell script.
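A merge-and-hillshade chain like Arthur's can be sketched by driving the standard GDAL command-line utilities from Python. This is a hypothetical reconstruction, not his plugin: the tile file names are made up, and his actual workflow first queries the DTM tile index shapefile to find which tiles to download.

```python
# Sketch of a merge/hillshade chain like the one described above, driving
# the standard GDAL command-line tools. Tile names are hypothetical; the
# real workflow queries a DTM tile-index shapefile first.
import subprocess

def hillshade_commands(tiles: list[str], vrt="dtm.vrt", out="hillshade.tif"):
    """Build the gdalbuildvrt + gdaldem invocations for a set of DTM tiles."""
    return [
        ["gdalbuildvrt", vrt, *tiles],                  # mosaic tiles into a virtual raster
        ["gdaldem", "hillshade", vrt, out, "-z", "2"],  # render hillshade, 2x exaggeration
    ]

def run_chain(tiles: list[str]) -> None:
    for cmd in hillshade_commands(tiles):
        subprocess.run(cmd, check=True)  # requires GDAL to be installed

print(hillshade_commands(["dtm_tile_a.tif", "dtm_tile_b.tif"])[0][0])  # → gdalbuildvrt
```

Building a VRT mosaic first avoids writing a huge intermediate merged GeoTIFF before the hillshade step.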

The best place to start navigating the website is to open the table of contents link located in the upper right corner of the landing page. The table of contents page then provides additional links to products and visuals Andy has created, including aerial photos, charts, interactive maps (recommend starting here), and thematic maps, to name just a few.  This page also provides more detail on open source components, some specifics on the use of Python and Pandas, a downloadable CSV file listing the web services (WMS, ArcGIS services, etc.) used on the blog, and much more.  It’s worth noting that the website also includes non-GIS/geospatial content.

If you need some additional evidence of how much Arthur has picked up on programming, using open source components, and navigating the geospatial landscape in this space, check out his tutorial on how to create a Digital Surface Model GeoTIFF using the National Map Downloader, LiDAR point clouds, and PDAL.  For example, the DSM image above is from a section of the Albany Pine Bush.  For a larger montage of the Albany Pine Bush digital surface model and samples of his code, click here for downloads.

And of course, the old stand-by: the hardcopy product. Here, a recently created thematic map of the median year of housing construction in the City of Albany. He used the NYS Tax Parcel Centroid Points data, aggregated down to the parcel level using R code, to create a GeoPackage, which was then used to create the map in QGIS. Additional layers were added for context.

There are many, many more examples of geospatial products, maps, and viewers on the website.  It’s a great example of how much can come out the other end when diving into and applying geospatial tools to one’s own personal interests and way of living. 

When you have a few minutes over lunch or a cup of coffee, take a look at his site.  In communicating with Andy over the course of putting this piece together, he mentioned he would be open to talking with and assisting nonprofit or similar community groups on specific GIS/mapping projects.  His contact information is below. 

Contact:

Andy Arthur
www.andyarthur.org
andy@andyarthur.org

Hub2Hub: Sharing and Promoting the Geospatial Message

Publishing spatial data content has proven to be great outreach in terms of providing transparency and exposure for many government GIS programs – including here at Westchester County.  It is increasingly easy for geospatial organizations to do, courtesy of powerful server technology that allows these same agencies to spin together data services based on common thematic features such as environmental, planimetric, business, demographics, cultural, parcels, and aerial photography, to name only a few.  Though the data service environment is better understood and leveraged by a specific user community that knows what type of spatial content to look for, there remains an even larger community of geospatial users who navigate data portals to download just individual data sets – looking for that individual shapefile, KML, or .dwg file for their own project.

And even with all of this good geospatial content available, many publishing organizations – particularly government – struggle with building educational outreach programs for the larger GIS user community: tailored programs for users in community and advocacy groups, nonprofits, and even the business sector, instructing them in how to effectively use and leverage the data.  Government GIS programs often do not have the resources or the business model to deliver these types of services outside of their organizations.

Enter the emerging role of “service” hubs which specialize in leveraging government open data in ways government programs normally cannot offer. Some are nonprofit in nature or affiliated with academic institutions.  Others, like OpenHub based in the lower Hudson Valley, are for-profit ventures.  The common theme among them is a window offering a wide range of engaging programs increasingly based on government data sets, featuring hackathons, workshops and training programs, application tools, and business tech “round table”-type discussions.  While open source software such as OpenStreetMap, QGIS, Python, and R is commonly used to support their missions (and thus serves as an easy and affordable entry point for new users), one can see ArcGIS Online and CARTO being offered as well.  They are effectively expanding the use and understanding of geospatial concepts and bringing technology to the people.

Third Party “Hubs”

I’ve referenced and written before on the great work of BetaNYC serving in this space.  Particularly with regard to extending content of NYC OpenData  through their numerous programs including the flagship NYC School of Data conference.  While New York City does an incredible job in promoting and maintaining city government’s open data catalog itself, BetaNYC extends the effort by building an entire framework of putting the data into the hands of residents on the streets.

This BetaNYC viewing application enables users to navigate the numerous political, administrative, and operational districts across New York City

Early in this space in the metro NYC area was the work of Steve Romalewski who started providing mapping services to the nonprofit sector as part of the Community Mapping Assistance Project (CMAP) which at the time was affiliated with New York Public Interest Research Group (NYPIRG).  Since 2006 he has been affiliated with the CUNY Mapping Service as part of the Center for Urban Research, Graduate Center at City University of New York (CUNY).

The Long Island Index was an early and very visible project that the CUNY Mapping Service helped create.  Another effort affiliated with the Center is the NYC Labor Market Information Service, which uses federal business and labor data sets.  Be sure to check out their Career Maps infographics and reports.  Similar academic efforts in the city exist at the Spatial Analysis and Visualization Initiative (SAVI) at Pratt Institute and the Center for Spatial Research at Columbia University.

Though originally developed at CMAP, the CUNY Mapping Service now hosts OASIS, one of the most comprehensive mapping and viewing applications covering the New York City footprint. Dozens of government-generated datasets are made available in the viewer.

While OpenHub is only beginning to jump into the geospatial space, its founder Yulia Ovchinnikova will be highlighting the Westchester County GeoHub as part of the 2021 HV TechFest Conference as an example of open government and open-source data sharing to promote building communities. OpenHub will also be using content from the county’s GeoHub as part of a course project during a current online Data Analytics Bootcamp.  As regional technology and economic development is an important focus of OpenHub’s mission and purpose, our discussion started by focusing on a new web map being published which highlights the emerging biosciences ecosystem community in Westchester County.  The County is home to the largest biosciences cluster in New York State, boasting 8,000 jobs and 20 percent of the State’s total biosciences employment. The cluster comprises academic institutes doing basic research, R&D and clinical-stage startups, large manufacturers, and supply chain participants. OpenHub offers a mixture of both free and fee-based online programs including coding clubs, meetups, programming, and specific events for small businesses and startups. OpenHub is an open ecosystem resource center committed to building a tech sector in the Hudson Valley and beyond.

Government open data and economic development. BioScience business location X,Ys are available for download via ArcGIS Online.

Many organizations leverage government-generated geospatial data to support their own business needs and/or to champion a host of civic needs and issues.  In just one area, environmental justice (EJ), a Google search located a document published by the Sea Grant program at SUNY Stony Brook which identifies the numerous individual agencies (largely government and nonprofit) engaged in providing “Environmental Justice Mapping Tools for New York State Communities”.  Child care, social services, public safety, public transportation, sustainability, climate change, and civic issues only scratch the surface of the spaces where advocacy groups are involved in using government geospatial content for the larger public good.

Summary

What differentiates the groups itemized above is that they are more full-service, a la carte organizations offering a wider range of products and services: staffing and resources to turn government data content into meaningful deliverables such as workshops (online and in-person), hackathon-type events, community events, training courses, and even bringing the data to civic leaders and administrators in ways which are simply not possible for government publishing agencies.  Such outreach is often very targeted, making the data more meaningful and putting it in the hands of those who understand its content the most – viewing and using the data at the “micro” rather than the “macro” level of a larger geographic footprint.  Local is better.

While technically not all “hubs” in name, these groups continue to illustrate the growing benefit of how organizations are adding value and exposure to government data assets.  In many scenarios, building partnerships with these types of service organizations can be of great value to government publishing agencies.

 

SPEED 2.0: Authoritative Environmental Remediation Mapping in New York City

Application Includes the use of both Open Source Software and Open Data Content

A lot of great geospatial projects and content are coming out of the NYC OpenData ecosystem.  In the same space throughout the city is the deployment of applications and viewers using open source software.  One such app is the Searchable Property Environmental E-Database (SPEED 2.0), built on top of CARTO and published by the Mayor’s Office of Environmental Remediation (OER).  I was introduced to the application via an online presentation organized by GISMO in March of this year.

SPEED 2.0 is an impressive collection of local/city, state, and federal geospatial datasets wrapped into one application for the purpose of helping individuals identify environmental issues – both current and past – on and/or adjacent to specific properties in New York City.  It is a sister application to the NYC Office of Environmental Remediation’s Environmental Project Information Center (EPIC) that provides information about the cleanup of brownfield sites across the city.

Individual parcels can be buffered by either 250’ or 500’ to show the proximity of adjacent parcels with current or past environmental issues, permitting, or contamination issues. Access to pertinent metadata is readily available.

Background

According to Lee Ilan, Chief of Planning in the Mayor’s Office of Environmental Remediation, the first version of SPEED was launched in 2009 as a web map with limited functionality, developed with PostGIS.  It was launched in support of the newly created office’s focus on the cleanup of brownfields across the city.  However, support for the initial application waned over the next several years, with minimal new content added.  After Superstorm Sandy, new funding through the U.S. Department of Housing and Urban Development (HUD) Community Development Block Grant Disaster Recovery Program – which OER secured – offered the opportunity for a major rewrite and update of the original application. SPEED 2.0 was designed by OER’s vendor Applied Geographics (AppGeo) to be a cloud-based application.  Originally the application was managed by the vendor, but since December 2020 OER has managed the app in the Google Cloud on its own.

The application also includes advanced search functions. For example, in the left-hand column using the filter options, I was able to identify only those OER projects in FEMA 100-year floodplains. Query is rendered in the map viewer.

“Carto software is helpful by providing a very modern user interface that generates layers which are compatible with Leaflet,” notes OER’s IT Director Maksim Kleban.  “It makes the transition from uploading our layers to turning them into fully functional, interactive maps seamless.”  AppGeo proposed the use of CARTO to OER, which has since found the software to be user-friendly and simple to use with standalone online applications. Carto is licensed annually for the amount of space and resources needed for the SPEED application and works very much like any other cloud solution, such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud.

Currently there are about 50-55 datasets included in the SPEED viewer. The large majority are OER datasets which are updated automatically by syncing with external agencies’ datasets on Open Data, or with OER’s internal data sources.  Generally, they each have an independent update schedule which is also automated.  The data is managed mostly by automatic updates on OER’s server, which communicates directly with Carto through an API. For layers which are not on an automatic update schedule, OER uses either a custom-designed interface or manually uploads data into Carto’s online platform.
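A sync push like the one described could look roughly like the following, using CARTO's public SQL API endpoint (`https://{account}.carto.com/api/v2/sql`). The account name, table, and API key here are placeholders; OER's actual sync code is not public.

```python
# Hedged sketch of pushing an updated record to CARTO via its SQL API.
# Account, table, and key are placeholders, not OER's real values.
import urllib.parse
import urllib.request

def carto_sql_request(account: str, api_key: str, query: str) -> urllib.request.Request:
    """Build a POST request against the CARTO SQL API endpoint."""
    url = f"https://{account}.carto.com/api/v2/sql"
    body = urllib.parse.urlencode({"q": query, "api_key": api_key}).encode()
    return urllib.request.Request(url, data=body, method="POST")

req = carto_sql_request(
    "example-account",  # placeholder account
    "XXXX",             # placeholder API key
    "UPDATE oer_projects SET status = 'closed' WHERE project_id = 'E-123'",
)
print(req.full_url)  # → https://example-account.carto.com/api/v2/sql
```

Sending the request (`urllib.request.urlopen(req)`) would execute the SQL against the hosted table, which is how a server-side scheduler can keep map layers in sync without touching the Carto UI.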

Users can search the SPEED database using a standardized address, common place names such as Bryant Park or Madison Square Garden (btw – even “MSG”!), or borough, block and lot (BBL) numbers.  The application also includes mark-up, feature transparency, and sharing tools, great HELP documentation, and easy access to metadata (as illustrated in the first image above), which is very helpful given the bevy of similar datasets from local, state, and federal sources accessible in the app.  Historical aerial photography from 1996, 1951, and 1924 enables users to identify previous land cover, which can be an indicator of the presence of historic fill.  A “Sensitive Receptors” layer includes the locations of facilities (schools, parks, libraries, health care, etc.) where occupants are more susceptible to the effects of environmental contamination.

“It continues to be a work in progress,” says Ilan. “In the future we would like to also have functionalities for registered users. We also would like to add more analysis capabilities where new layers can be easily integrated with advanced search features.”

SPEED 2.0 Featured on NYC Open Data Week

For the first time ever, OER participated in NYC Open Data Week in early March.  For those looking for a deeper dive into SPEED 2.0, use the link below to listen to Lee’s presentation.

Contact:

Ms. Lee Ilan
Chief of Planning
NYC Mayor’s Office of Environmental Remediation
lilan@cityhall.nyc.gov

Focus on Hudson River Flooding Impacts

Online Application Provides Tools and Data for Communities at Risk Along the Hudson River

Background

The Hudson River Flood Impact Decision Support System (HRFIDSS) is an online mapping application as well as a collection of geospatial products and data which illustrate the scale of potential flooding for tidally-affected shorelines of the Hudson River Valley and Westchester County coastal communities. In all, the viewer covers the shoreline communities in 11 counties in the region.  Developed by researchers from the Center for International Earth Science Information Network (CIESIN), Stevens Institute of Technology, and the Columbia Water Center, the application enables individual residents, community groups, and municipal and regional planners to evaluate risk under a range of potential flood outcomes and to assess impacts on social vulnerability, natural resilience, and critical infrastructure.

The tool also provides downloadable estimates of flooding impacts, including affected population and critical infrastructure, and cost estimates of building damage.  It is further enhanced by hydrological models which expand the flooding impacts to include storm tides/storm intensity and the contributions of river tributaries and streams.

The Hudson River Flood Impact Decision Support System helps address the growing interest across all levels of government in the New York region towards developing better forecasting capabilities and a better understanding of flood probabilities under a variety of sea level rise and storm frequency scenarios.

The application was developed using OpenLayers.  Critical infrastructure data is housed in a relational PostGIS database and served as a Web Feature Service (WFS) using GeoServer.  Flood layers are raster data published from an ArcGIS Server at CIESIN, while supporting imagery layers are consumed from New York State.
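For readers unfamiliar with WFS, the OpenLayers client retrieves those infrastructure features with a standard OGC GetFeature request to GeoServer.  A sketch of the request shape (the server URL and layer name are placeholders, not the application's actual published endpoints):

```python
from urllib.parse import urlencode

# Shape of a WFS 2.0 GetFeature request as an OpenLayers client might
# issue it to GeoServer. URL and layer name are hypothetical placeholders.
def wfs_getfeature_url(base: str, type_name: str) -> str:
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "outputFormat": "application/json",  # GeoJSON, easy for web clients
    }
    return base + "?" + urlencode(params)

url = wfs_getfeature_url(
    "https://example.org/geoserver/wfs", "hrfidss:critical_infrastructure"
)
```

GeoServer responds with the feature collection, which the web map then styles and renders on top of the flood rasters.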

Impact Assessments

In addition to a variety of unique mapping tools, one of the key features of the application is the generation of “Impact Assessments”.  These are generated from the application’s flood maps using a combination of historical and synthetic flood events which include water depth, surge, rainfall, and sea level rise scenarios of 0″, 6″, 12″, 18″, 24″, 30″, 36″, 48″, 60″, and 72″, as well as 5-year through 1000-year flood zones.  Flood scenarios presented in the mapping tool are accompanied by a set of impact estimates at the local level. These are divided into three themes: critical infrastructure, social vulnerability, and natural resilience features.
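The scenario space is essentially a grid: each sea level rise increment crossed with each flood return period.  A small sketch of enumerating that grid (the sea level rise increments come from the tool's documented inputs; the specific intermediate return periods listed here are illustrative of the 5- through 1000-year range):

```python
# Sea level rise increments (inches) are taken from the tool's inputs;
# the return-period list is an illustrative sample of the 5- to 1000-year range.
SLR_INCHES = [0, 6, 12, 18, 24, 30, 36, 48, 60, 72]
RETURN_PERIODS_YR = [5, 10, 25, 50, 100, 500, 1000]

scenarios = [
    {"slr_in": slr, "slr_cm": round(slr * 2.54, 1), "return_period_yr": rp}
    for slr in SLR_INCHES
    for rp in RETURN_PERIODS_YR
]
```

Each combination would correspond to one flood surface and its accompanying set of local impact estimates.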

Green Map 2020: Evolving with Technology and Staying Community Focused

For nearly 30 years, Green Map has been an anchor in the Metro NYC nonprofit mapping community.  Led by its original founder, Wendy Brawer, Green Map’s portfolio of work and advocacy in locally-led mapping and sustainability efforts is quite impressive.  Since the launch of the Green Map System in 1995 and publication of the original Green Map of NYC through Brawer’s eco-design company, the program’s engagement tools and mapping resources have been adapted by locally-led projects in 65 countries.  Many joined the Green Map network when Greenhouse – their first content-managed website – was activated in 2007.  The site grew to become a rich resource highlighting community-developed mapping work in all parts of the world.

While “green” mapping was one of its early defining products and areas of focus, the organization has expanded its services to include interactive web mapping, education, multimedia, event coordination, workshops, tours, planning and consulting, and software development tools.

 

Community Engagement Maps and Tools

Green Map’s award-winning icons, tools, platform(s), and tutorials help create engaging and visually powerful mapping content used as guides for local nature, culture, community living, social justice, and resources for sustainable living. In addition to providing tools and tutorials on making maps, Green Map promotes its mapping efforts as part of Project Stories.  Project Stories are locally written and use tags to categorize the story type, theme, and purpose. Users can discover how Green Map projects were created, learn about the people and organizations involved, and read about each project’s impact on its community.  Green Map projects are tagged and can be searched across 11 categories.

Besides its own mapping platform (with version 2 of the Open Green Map now in beta), the program also offers suggestions and tutorials on how to promote the use of the Green Map icons in other interactive mapping platforms such as ArcGIS Online (Story Maps), CARTO, Google Maps, Google Earth, and the Map of Tomorrow.


Game Time: Mapping Moose in New York State

Introduction

A rare treat in eastern upstate New York – mostly in the Adirondacks, though sometimes in the Taconic Highlands along the Massachusetts and Vermont borders and within the Catskills – is to see moose in the wild.  Moose (Alces alces) are the largest member of the deer family (Cervidae) and the largest land mammal in New York State. Having been absent from the Empire State since the 1860s, the species began to reenter the state on a continuous basis in the 1980s.  While re-establishment of the moose population in New York has been viewed and supported as a positive sign of a healthier, more complete natural ecosystem, it does not come without a range of potential problems, requiring proactive management and monitoring by New York State Department of Environmental Conservation (DEC) wildlife biologists and researchers.

So in 2020 it should come as no surprise that wildlife biologists in New York State – and around the world – are using geospatial technologies to support their work.  In addition to estimating moose population size, New York’s wildlife biologists and scientists also focus on improving moose survival and reproductive rates and assessing their diet and health.  Assisting the DEC’s work towards researching moose habitat and resource requirements is David W. Kramer, Research Scientist, in the Division of Fish and Wildlife.  The toolkit which augments his moose research includes a mixture of GPS, remote sensing imagery, a variety of GIS data layers, and both ESRI and R software – R being a free, open-source software package for statistical computing and graphics commonly used in the research community.

Population Counts and Observations

To date, Mr. Kramer and colleagues have been observing the locations of 26 moose collared with either a Lotek or Telonics GPS unit, which can store data “on board” the collar; the data can then be retrieved by recovering the collar or by getting close enough to the moose to download it to a receiver. “Uplink” collars can store data on the collar as a backup, but also send daily data uploads via satellite, which are then stored online.  Moose are captured (to put the collar on) by a crew which “net-guns” the animal from a helicopter. For their research, DEC staff focus on female moose (cows) for two reasons: (1) they want to track how many offspring each cow has – the collars facilitate “following” the females in the summer to count calves – and (2) male moose (bulls) go through physiological and body changes during the same period that make collaring males impractical or even unsafe.  Data associated with the collars are important in analyzing survival and calving success as well as the geography associated with habitat selection.
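A common first step with GPS fix data of this kind is computing great-circle distances between consecutive fixes to characterize movement.  The DEC team works in R; this Python sketch illustrates the same haversine calculation, with purely illustrative Adirondack coordinates:

```python
import math

# Haversine great-circle distance between two GPS collar fixes, a typical
# building block for movement and habitat-selection analyses.
def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Two fixes roughly a day apart (illustrative Adirondack coordinates)
d = haversine_km(43.96, -74.42, 44.01, -74.38)
```

Summing such distances over a season gives a simple movement metric; in practice researchers layer far more sophisticated models (home range estimators, step-selection functions) on top of the same raw fixes.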


The Definitive Map for New York State Media Arts

New Online Application Shows Locations and Services for Organizations and Artists Across the State

Image from Heidi Neilson’s Southern Florida Beachball Antenna Archive (2017)

While the Empire State geospatial community is accustomed to seeing terms such as Google Maps JavaScript API, GeoJSON, and HTML as part of the conversation in building online mapping applications for government and business, we increasingly see the use of these same online mapping development tools in the nonprofit sector as well.  One such example is the newly updated New York Media Arts Map, a project of the New York State Council on the Arts (NYSCA) Electronic Media and Film program, managed by Wave Farm.  Though Wave Farm is more commonly recognized as an arts organization dedicated to transmission arts with FM and online radio stations, it is their regrant partnership through NYSCA which ultimately led to its stewardship of the online Media Arts Map.  Serving in this capacity, Wave Farm assists in administering and distributing grant funds, ranging from $500 to $10,000, to dozens of New York State individual artists and organizations – all of which are included in the Map.

The Map

At its core, the Map is an extensive online catalogue of Media Artists whose interdisciplinary work spans a wide range of fields including visual art, film, computer graphics, data visualization, sculpture, architecture, music, photography – even drones – and much more. The Map lists organizations which are recent grantees of the NYSCA Electronic Media and Film program (EMF) and the Media Arts Assistance Fund (MAAF).  In all, there are 198 organizations and 69 artists included in the Map.  First launched in 2009 and originally hosted by Rhizome, the 2014 update is the work of Peter Berry, a developer affiliated with Supercosm, a Seattle, WA-based firm which focuses on work with nonprofits.  In addition to the software components already mentioned, Peter’s work on the back end of the map includes the open source components CakePHP and Apache bundled with MySQL.

In addition to providing location information on organizations and artists included on the map, pull down menus along the top tool bar include additional statewide media arts information on meetings and workshops, partnership reports, and information on funding opportunities.

The Search box offers a wide range of options to filter the organization/artist database. Search queries provide a description of the types of work, services, and types of media arts offered as well as online contact information.

Intersection of Media Arts and Technology

It’s interesting to see, albeit on some small level, how the media arts world finds itself close to the geospatial space.  For example, take the work of Harvestworks, which supports the creation of art works achieved through the use of new and evolving technologies and provides an environment for experimentation with project consultants, technicians, instructors, and innovative practitioners in all branches of the electronic arts.  A recent report by the organization entitled Common Ground provides “an overview of how and where transdisciplinary work is taking place in New York State, and makes recommendations to encourage further opportunities for artists that, when combined with environmental research, can help engage the public, encourage environmental stewardship, and benefit communities”.

The report was published by the Harvestworks Digital Media Arts Center and The New York City Urban Field Station, which is a partnership of the U.S. Forest Service, New York City Department of Parks and Recreation, and the Natural Areas Conservancy. (The Field Station in NYC supports the Arts and Humanities Residency Program. Check out the Summer 2018 Moon Arrow project – a time-lapse compilation of the Moon Arrow pointing to the moon on New York City shorelines.)  Open data, building code, and hackathons in the geospatial space of your interest?  Then take a look at the work of the School for Poetic Computation.  And how about drones?  While the technology is evolving incredibly fast, there was the 2018 Art Omi, Wave Farm, and eteam live performance of The Aerial Actor, which focused on the rising use of drones in science and art.  Actual drones were used as both carriers of messages and actors in the performance.  “The Map is a vital resource identifying art and technology projects and activity throughout New York State supported by NYSCA Electronic Media & Film,” says Galen Joseph-Hunter, Executive Director, Wave Farm. “It is also a useful tool for showing a geographic picture of how support is distributed throughout the state.”

Summary

So stop for a minute, grab a cup of coffee, put down the pen – I mean your mouse – and take a deep dive into the New York Media Arts Map site.  See what’s in your region, maybe even your neighborhood.  Support the statewide media arts community.  Cool stuff – from traditional artistic settings to the avant garde and experimental.  And sometimes, just a one-off from the geospatial space we work in.

Enjoy.

Contact:

Galen Joseph-Hunter
Executive Director
Wave Farm
Radio: WGXC 90.7-FM, Transmission Arts, Grants
518-622-2598
galen@wavefarm.org

Geospatial Business Spotlight: CARTO

Company Name:                   CARTO

Location:                               New York, New York​​​​​

Website:                               www.carto.com

Employees:                          143

Established:                         2009

Founded by Javier de la Torre, CARTO is a diverse and expanding company which includes data scientists, geospatial analysts, cartographers, software developers and engineers, visualization experts, and web designers focusing on Location Intelligence.  Most recently in May 2019, CARTO expanded its worldwide professional service portfolio offerings by acquiring Geographica.

Providing ready-to-use software tools for data scientists and application developers, CARTO focuses on turning clients’ location data into business outcomes, built around the following workflow:

  • Data Ingestion & Management
  • Data Enrichment
  • Analysis
  • Solutions & Visualization
  • Integration

Software & Capabilities

Complex analysis, filtering, and visualization are integrated in real time, reducing time-to-insight.  Users can integrate CARTO’s APIs and geocoding services to complement other apps and business applications, and these can be combined with custom proprietary analytical models.  CARTO can be used as an engine to visualize a wide range of data services.

CARTO is scalable and offers a Software as a Service (SaaS) deployment model to push new features instantly, allowing users to “grow as you go.”  Being enterprise-ready also means making on-premise and private cloud architecture solutions available to clients.  CARTO also offers a mobile platform.

Sample Products and Applications

On October 16, 2019, CARTO hosted the 2019 Spatial Data Science Conference (SDSC) at Columbia University, which I covered and reported on in a previous blog post.  Typically, Geospatial Business Spotlight focuses on three or four applications from the firm being highlighted.  However, since SDSC was a day-long series of entirely CARTO-based applications, the conference website provides a better and more thorough overview of how CARTO is applied in business, academia, government, and nonprofit organizations.  Choose from presentations by Uber, Facebook, the University of Chicago, American Securities, Salesforce Maps, and MIT, among others.  In the Empire State, CARTO supports numerous programs in the metropolitan New York City area in both business and government.

Contributions to the Profession 

As part of CARTO’s long-standing commitment to FOSS, Open Source, Open Data, and Open Science, the company has collaborated with many organizations providing access to next generation geospatial technology, data, and models. Most recently (October 2019), CARTO’s Javier de la Torre joined the Urban Computing Foundation (UCF) Technical Advisory Committee, a neutral forum for accelerating geospatial open source and community development.  The UCF operates under the umbrella of The Linux Foundation.  In July 2019, Geospatial Media and Communications included Javier de la Torre as part of the Location Analytics & Business Intelligence (LA & BI) Advisory Board.

CARTO is open source software built on PostGIS and PostgreSQL, first released in beta at FOSS4G in September 2011 and officially released at Where 2.0 in April 2012.  The CARTO software solution uses JavaScript extensively in front-end web applications, back-end Node.js-based APIs, and client libraries.

Overall, the CARTO platform brings together data management, enrichment, analysis, and visualization components.

The CARTO platform enables users to access and manage vast amounts of data while at the same time providing numerous processes to discover, filter, and integrate local and Big Data libraries.  Geo-enabling large datasets provides a means to visualize and better understand large and complex datasets. CARTO enriches user location data with versatile, relevant datasets, such as demographics and census, and advanced algorithms, drawn from CARTO’s own Data Observatory and offered as Data as a Service (DaaS).

CARTO uses widget-driven dashboards, an array of maps, and unified workflows so that non-GIS and non-mapping users/staff can bring the power of location into the organization’s decision making.

The CARTO software user interface provides both user-friendly mapping and dashboard visuals which can be customized to user needs and experience.


Contact:

Florence Broderick
VP Marketing
flo@carto.com
4475-686-89402