Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
We’ve been asked to create measures of communities that are “walkable” for several projects. While there is no standard definition of what makes a community “walkable”, and the definition of “walkability” can differ from person to person, we thought an indicator that explores the total length of available sidewalks relative to the total length of streets in a community could be a good place to start. In this blog post, we describe how we used open data from SPC and Allegheny County to create a new measure for how “walkable” a community is. We wanted to create a ratio of the length of a community’s sidewalks to the length of a community’s streets as a measure of pedestrian infrastructure. A ratio of 1 would mean that a community has an equal number of linear feet of sidewalks and streets. A ratio of 2 would mean that a community has two linear feet of sidewalk for every linear foot of street; in other words, every street has a sidewalk on both sides of it. In creating a measure of the ratio of sidewalks to streets, we had to do a little bit of data cleanup. Much of this was by trial and error, ground-truthing the data based on our personal experiences walking in different neighborhoods. Since street data was not shared as open data by many counties in our region either on PASDA or through the SPC open data portal, we limited our analysis of “walkability” to Allegheny County.
In looking at the sidewalk data table and map, we noticed that trails were included. While nice to have in the data, we wanted to exclude trails from the ratio to avoid a situation where a community that had few sidewalks but was in the same blockgroup as a park with trails would get “credit” for being more “walkable” than it actually is according to our definition. We did this by removing all segments where “Trail” appeared in the “Type_Name” field.
We used a similar tabular selection method to remove crosswalks from the sidewalk data (“Type_Name” = “Crosswalk”). We kept the steps in the dataset along with the sidewalks.
In the street data obtained from Allegheny County’s GIS department, we excluded limited-access highway segments from the analysis, since pedestrians are prohibited from using them and their presence would have reduced the sidewalk/street ratio in communities where they are located. We did this by excluding street segments whose values in the “FCC” field (designating the type of street) equaled “A11” or “A63.” We also removed trails from this dataset by excluding segments classified as “H10.” Since documentation was sparse, we looked at how these features were classified in the data to determine which codes to exclude.
After running the data initially, we realized that excluding alleyways from the calculations could also improve the accuracy of our results. Some communities with substantial pedestrian infrastructure have alleyways, and including them would make those communities appear less “walkable” in our indicator. We removed these records by excluding values of “Aly” or “Way” in the “St_Type” field, and we also excluded streets where the word “Alley” appeared in the street name (“St_Name”) field.
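To make the cleanup steps above concrete, here is a minimal sketch of the workflow in Python with geopandas. It assumes the sidewalk and street layers are shapefiles with the fields named above and a community identifier column; the file names and the “geoid” join column are hypothetical, not the exact data published with this post.

```python
# Minimal sketch, assuming shapefiles with the fields named in the post and a
# hypothetical "geoid" column identifying the community/block group.
import geopandas as gpd

sidewalks = gpd.read_file("sidewalks.shp").to_crs(epsg=2272)  # PA South State Plane, US feet
streets = gpd.read_file("streets.shp").to_crs(epsg=2272)

# Drop trails and crosswalks from the sidewalk layer.
sidewalks = sidewalks[~sidewalks["Type_Name"].isin(["Trail", "Crosswalk"])]

# Drop limited-access highways (A11, A63), trails (H10), and alleys from the street layer.
streets = streets[~streets["FCC"].isin(["A11", "A63", "H10"])]
streets = streets[~streets["St_Type"].isin(["Aly", "Way"])]
streets = streets[~streets["St_Name"].str.contains("Alley", case=False, na=False)]

# Sum linear feet per community and compute the sidewalk-to-street ratio.
sidewalk_len = sidewalks.assign(ft=sidewalks.length).groupby("geoid")["ft"].sum()
street_len = streets.assign(ft=streets.length).groupby("geoid")["ft"].sum()
ratio = (sidewalk_len / street_len).rename("sidewalk_street_ratio")
print(ratio.sort_values(ascending=False).head())
```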
The full methodology used for this dataset is captured in our blog post, and we have included the sidewalk and street data used to create the ratio here as well.
These data display the felt effects for selected historical California earthquakes. By selecting a linked earthquake you can see a map of the area and intensity of shaking from that earthquake. Then by clicking on a city on the intensity map (or list beneath the map) you can see what that location reported after the earthquake. These “felt reports” are usually from newspapers during the 1850 to 1930 time period. The felt reports were compiled and published in 1981* and 1982** with updates added through the years. Not all earthquakes have intensity maps and felt reports online since the conversion from printed report to web page is an on-going process. Most of the San Francisco Bay area events between 1838 and 1906 are finished. More information is available in the California Geological Survey publication Map Sheet 49 and Department of Conservation - Earthquake Catalogs.
*Toppozada, T. R., C. R. Real, and D. L. Parke (1981). Preparation of isoseismal maps and summaries of reported effects for pre-1900 California earthquakes, Calif. Div. Mines Geol. Open-File Rept. 81-11 SAC, 182 pp.
**Toppozada, T. R. and D. L. Parke (1982). Areas damaged by California earthquakes, 1900-1949, Calif. Div. Mines Geol. Open-File Rept. 82-17, 65 pp.
Delta Primary Zone Boundary
The history of the primary zone boundary is as follows: the Primary Zone was defined in the 1992 Delta Flood Protection Act by reference to a map attached to the legislation and on file with the Secretary of State (see Public Resources Code section 29728). The map was submitted by the Delta Protection Commission. It is a large-extent (small-scale) map with no real controls, little or no reference marks or guides of any kind, and no legal description. As such, from a mapping point of view, it leaves much to be desired. Nevertheless, by law, this map defines the Primary Zone boundary. Sometime shortly after the law was passed, DWR Land & Right of Way drew the boundary on 24k topo maps which also carried the precise, agreed-upon legal Delta boundary. There are some significant differences between the DWR version and the official version. Current DWR Land & Right of Way staff (Carrol Leong & Fred Mau) had no readily available explanation, and the person who originally drew the boundary is no longer there. That is unfortunate, because not only are these maps much more "accuracy friendly", but there may have been good reasons why the boundary was drawn as it was.
This is the Delta primary zone boundary. It was drawn by Joel Dudas on November 27, 2002, at the request of Margit Arambru, Delta Protection Commission, using the method described below. The legal Delta/primary zone effort conducted by Chico State had raised questions about the primary zone boundary, and upon inspection of the issue it was determined that there is no precise solution available at this time. Lori Clamurro & Margit Arambru indicated that this delineation was acceptable to them upon review (12/8/2002).
METHOD: There were significant errors in the paper base map, as evidenced by errors in the locations of roads, watercourses, and the legal Delta boundary itself. Because of these errors, the paper base map was used as a guide, rather than as a literal translation, to locate the primary zone boundary. A second significant assumption was that the intent of the Primary Zone map was for the legal boundary and the primary zone boundary to be one and the same in many places, but that drawing them literally atop each other would not produce distinguishable lines; on the source paper map they were therefore lined up adjacent to one another, with the gap kept as small as possible while still being far enough apart to clearly distinguish the two lines. For GIS purposes, the shapefile was therefore created by tracing the legal boundary line wherever this was felt to be appropriate. The third major assumption was that, in places where the primary zone and the legal boundary are separated, the primary zone boundary was equivalent to the boundary drawn by DWR Land & Right of Way on the higher-accuracy 24k maps, except where significant deviations obviously occurred as indicated by the official paper base map. The rationale is that the 24k map does a better job of delineating the boundary according to actual features (watercourses, rec district boundaries, etc.) where the intended boundary was clearly the same but the paper map simply cannot represent this intent accurately. However, in places where the intent clearly shows a discrepancy from the "higher accuracy" line, the boundary on the paper base map was literally traced.
Delta Secondary Zone Boundary
The parent of this file was one of the Delta Vision Status & Trends shapefiles, published in April 2007. The change to the boundary near Van Sickle was made subsequent to delivery to DWR on 10/8/2009, and offsets versus the legal Delta boundary were corrected by DWR on 10/22/2009. Unless better information becomes available, these are considered the best boundaries available at this time.
Indicator 10.3.1: Proportion of population reporting having personally felt discriminated against or harassed in the previous 12 months on the basis of a ground of discrimination prohibited under international human rights law.
Methodology: Number of survey respondents who felt that they personally experienced discrimination or harassment on one or more prohibited grounds of discrimination during the last 12 months, divided by the total number of survey respondents, multiplied by 100.
Data Source: National Human Rights Committee.
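A minimal sketch of this calculation, with hypothetical counts, would be:

```python
def sdg_10_3_1(discriminated_respondents: int, total_respondents: int) -> float:
    """Proportion (%) of respondents reporting discrimination or harassment
    on a prohibited ground during the previous 12 months."""
    return 100.0 * discriminated_respondents / total_respondents

# Hypothetical example: 140 of 2,000 survey respondents -> 7.0%
print(sdg_10_3_1(140, 2000))
```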
ESYS plc and the Department of Geomatic Engineering at University College London (UCL) have been funded by the British National Space Centre (BNSC) to develop a web GIS service to serve geographic data derived from remote sensing datasets. Funding was provided as part of the BNSC International Co-operation Programme 2 (ICP-2).
Particular aims of the project were to:
use Open Geospatial Consortium (OGC, recently renamed from the OpenGIS Consortium) technologies for map and data serving;
serve datasets for Europe and Africa, particularly Landsat TM and Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) data;
provide a website giving access to the served data;
provide software scripts, etc., and a document reporting the data processing and software set-up methods developed during the project.
ICEDS was inspired in particular by the Committee on Earth Observing Satellites (CEOS) CEOS Landsat and SRTM Project (CLASP) proposal. An express intention of ICEDS (aim 4 in the list above) was therefore that the solution developed by ESYS and UCL should be redistributable, for example, to other CEOS members. This was taken to mean not only software scripts but also the methods developed by the project team to prepare the data and set up the server. In order to be compatible with aim 4, it was also felt that the use of Open Source, or at least 'free-of-cost', software for the Web GIS serving was an essential component. After an initial survey of the Web GIS packages available at the time, the ICEDS team decided to use the Deegree package, a free software initiative founded by the GIS and Remote Sensing unit of the Department of Geography, University of Bonn, and lat/lon. However, the Red Spider web mapping software suite was also provided by IONIC Software; this is a commercial web mapping package, provided pro bono by IONIC for this project, and it has been used in parallel to investigate the possibilities and limitations opened up by using a commercial package.
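For context, OGC map serving of the kind described here is typically exercised through WMS GetMap requests. The sketch below assembles one such request in Python; the server URL and layer name are hypothetical placeholders rather than actual ICEDS endpoints.

```python
# A sketch of an OGC WMS 1.1.1 GetMap request of the kind a Deegree or Red Spider
# server would answer. The base URL and layer name are hypothetical.
from urllib.parse import urlencode

base_url = "https://example.org/deegree/services/wms"  # hypothetical endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "srtm_dem",        # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-20,-40,55,40",     # lon/lat box covering Africa and southern Europe
    "WIDTH": "800",
    "HEIGHT": "850",
    "FORMAT": "image/png",
}
print(f"{base_url}?{urlencode(params)}")
```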
The files linked to this reference are the geospatial data created as part of the completion of the baseline vegetation inventory project for the NPS park unit. The current format is an ArcGIS file geodatabase, but older formats may exist as shapefiles. In 2009, the National Park Service (NPS) Vegetation Mapping Inventory funded the NPS South Florida Caribbean Network (SFCN) to evaluate the accuracy of a vegetation map produced by the University of the Virgin Islands (UVI) Eastern Caribbean Center, Conservation Data Center for Virgin Islands National Park (VIIS). The UVI vegetation map of VIIS was completed in 2001 and was based on aerial imagery from 1994. VIIS park staff felt that the UVI vegetation map was relatively accurate, but recognized that no formal accuracy assessment of the product had occurred at the time of its creation. Both the UVI and SFCN vegetation maps of VIIS relied on aerial imagery and photo-interpreters to delineate vegetation communities. However, the SFCN vegetation map had the benefit of LiDAR data, a technology and data source not readily available when the UVI vegetation map was produced. In addition, the SFCN vegetation map benefited from technological advances in aerial image acquisition that significantly improved the quality and resolution of imagery used; GPS that allowed precise spatial location determination; and GIS science that permitted the viewing, layering, and manipulation of multiple data sources simultaneously. The SFCN vegetation map also benefited from the use of digital orthophotographs that take into account the surface elevation (topography) of the earth and camera tilt. The UVI vegetation map has an estimated map accuracy of 45.9% with a lower 90th percentile confidence interval of 38.5%, while the SFCN vegetation map accuracy is estimated at 87.9% with a lower 90th percentile confidence interval of 82.0%. The SFCN vegetation map has approximately 2.1 times more detail, in the form of individual patches, than the UVI vegetation map does (1,430 vs 686 patches, respectively). Mean patch size and maximum patch size are smaller in the SFCN vegetation map than in the UVI vegetation map. This results in the SFCN vegetation map being less homogeneous than the UVI map even though the total number of community types mapped is nearly identical (27 vs 29).
This is a modeled dataset showing heat index* on the evening (7-8pm CST) of July 27, 2024. It is based on the Urban Heat Island Evening Traverse point dataset and was created by CAPA Strategies to support the National Oceanic and Atmospheric Administration’s Urban Heat Island Mapping Campaign.
*Heat index is an approximation of how hot it feels when humidity is combined with air temperature. For this project, the heat index was calculated by combining the temperature with the corresponding relative humidity measurement using the equations advised by the National Weather Service (CAPA Strategies, 9/2024).
Public Domain Dedication (CC0 1.0): https://creativecommons.org/publicdomain/zero/1.0/
Data science can be applied to any subject or field of study, and the paranormal is no exception. Every dataset has a story to tell, and our goal is always to analyze patterns and trends, create data visualizations, and use statistical methods in an effort to capture that story. Regardless of your beliefs, all I ask is that you keep an open mind when diving into this dataset's controversial topic and look for any possible valuable insights as if it were business as usual.
This data includes the geographical location of each sighting, and I've provided the additional variables Season, Month, Year, and Time_of_Day. Through my own tedious efforts I found and corrected a great deal of inconsistencies around when each encounter actually occurred, because the variable date_reported does not always reflect it; that is why I felt obligated to make these variables as accurate as I could. The devil is in the details: if you read the **description** variable for some of these cryptid encounters, you can see a different date mentioned than date_reported. Unfortunately, the date variables will be blank or NA for encounters whose dates could not be accurately determined.
There is room for spatial GIS analysis with the location variables **Latitude** and **Longitude**.
There is room for text mining on the **description** variable; some ideas I'd suggest looking into are approximate height, Dogman type, and eye color.
A series of London-wide climate risk maps has been produced to analyse climate exposure and vulnerability across Greater London. These maps were produced by Bloomberg Associates in collaboration with the Greater London Authority to help the GLA and other London-based organisations deliver equitable responses to the impacts of climate change and target resources to support communities at highest risk. Climate vulnerability relates to people’s exposure to climate impacts like flooding or heatwaves, but also to personal and social factors that affect their ability to cope with and respond to extreme events. High climate risk coincides with areas of income and health inequalities. A series of citywide maps overlays key metrics to identify areas within London that are most exposed to climate impacts and have high concentrations of vulnerable populations.
In 2022, Bloomberg Associates updated London’s climate risk maps to include additional data layers at a finer geographic scale (LSOA boundaries). These maps were built upon earlier maps using the Transport for London (TfL) hexagonal grid (often referred to in this report as the “Hex Grid”). In addition, the map interface was updated to allow users to compare individual data layers to the Overall, Heat and Flooding Climate Risk maps. Users can now also see the specific metrics for each individual LSOA to understand which factors are driving risk throughout the city. In 2024, Bloomberg Associates further modernized the climate risk maps by updating the social factor layers to use more recent (2021) census data. In addition, air temperature at the surface was used in place of surface temperature alone, as a more accurate assessment of felt heat.
The Mayor is addressing these climate risks and inequalities through the work of the London Recovery Board, which includes projects and programmes to address climate risks and ensure a green recovery from the pandemic. Ambitious policies in the London Environment Strategy and the recently published new London Plan are also addressing London’s climate risks.
The data layers at the LSOA level are available here to use in GIS software:
Climate risk scores (overall, heat, and flood): https://cityhall.maps.arcgis.com/home/item.html?id=22484ef240624e149735ca1aaa4c9ade#
Social and physical risk variables: https://cityhall.maps.arcgis.com/home/item.html?id=bc06d80731f146b393f8631a0f98c213#
This displays recent earthquake information from the United States Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) program.
In addition to displaying earthquakes by magnitude, this web map also provides earthquake impact details. Impact is measured by population as well as models for economic and fatality loss. For more details, see: PAGER Alerts.
Events are updated as frequently as every 5 minutes and are available up to 30 days with the following exceptions:
Events with a Magnitude LESS than 3.0 are retained for 3 days
Events with a Magnitude LESS than 4.5 are retained for 7 days
In addition to event points, ShakeMaps are also provided. These have been dissolved by Shake Intensity to reduce the layer complexity.
The specific layers provided in this service have been Time Enabled and include:
Events by Magnitude: The event’s seismic magnitude value. Contains PAGER Alert Level: USGS PAGER (Prompt Assessment of Global Earthquakes for Response) system provides an automated impact level assignment that estimates fatality and economic loss. Contains Significance Level: An event’s significance is determined by factors like magnitude, max MMI, ‘felt’ reports, and estimated impact.
Shake Intensity: The Instrumental Intensity or Modified Mercalli Intensity (MMI) for available events.
For field terms and technical details, see: ComCat Documentation.
This map is provided for informational purposes and is not monitored 24/7 for accuracy and currency. Always refer to the USGS source for official guidance.
How to Use this Web App
This web app can be used for public information, awareness, and visualization of global earthquakes as a standalone map or embedded in ArcGIS Online apps and dashboards. Map pop-ups contain detailed event information which individually links to each event’s USGS page. The web map for this app is here. There are two articles that walk through this app in greater detail:
Earthquake Mapping Part I: One Symbol from Multiple Fields in Arcade
Earthquake Mapping Part II: The Cartography of Time, Magnitude, and Alert Levels
All events are derived from the same point data and are classified by an event’s Time (Past 24 hours, Past Week, and Past 3 Months), Magnitude (> 4.0 Richter Magnitude), and PAGER Alert Level.
The specific layers provided in this web map include:
Light Basemap
Dark Basemap
The shakemaps have been dissolved by a unique value and ordered so that the most intense shaking appears on top. This is achieved by using symbol level drawing.
Shake Map
In addition to displaying earthquakes by magnitude, this service also provides earthquake impact details. Impact is measured by population as well as models for economic and fatality loss. For more details, see: PAGER Alerts.
Consumption Best Practices: As a service that is subject to very high usage, ensure peak performance and accessibility of your maps and apps by avoiding the use of non-cacheable relative Date/Time field filters. To accommodate filtering events by Date/Time, we suggest using the included "Age" fields, which maintain the number of days or hours since a record was created or last modified, compared to the last service update. These queries fully support the ability to cache a response, allowing common query results to be efficiently provided to users in a high-demand service environment. When ingesting this service in your applications, avoid using POST requests whenever possible. These requests can compromise performance and scalability during periods of high usage because they, too, are not cacheable.
Update Frequency: Events are updated as frequently as every 5 minutes and are available up to 30 days with the following exceptions:
Events with a Magnitude LESS than 4.5 are retained for 7 days
Events with a Significance value ("sig" field) of 600 or higher are retained for 90 days
In addition to event points, ShakeMaps are also provided. These have been dissolved by Shake Intensity to reduce the layer complexity.
The specific layers provided in this service have been Time Enabled and include:
Events by Magnitude: The event’s seismic magnitude value. Contains PAGER Alert Level: USGS PAGER (Prompt Assessment of Global Earthquakes for Response) system provides an automated impact level assignment that estimates fatality and economic loss. Contains Significance Level: An event’s significance is determined by factors like magnitude, max MMI, ‘felt’ reports, and estimated impact.
Shake Intensity: The Instrumental Intensity or Modified Mercalli Intensity (MMI) for available events.
For field terms and technical details, see: ComCat Documentation.
Alternate Symbologies: Visit the Classic USGS Feature Layer item for a Rainbow view of Shakemap features.
Revisions:
Sep 16, 2025: Exposed ‘UniqueId’ field in Shake Intensity Polygon layer.
Sep 14, 2025: Upgrade to Layer data update workflow, to improve reliability and scalability.
Aug 14, 2024: Added a default minimum scale suppression of 1:6,000,000 on the Shake Intensity layer.
Jul 11, 2024: Updated event popup, setting "Tsunami Warning" text to "Alert Possible" when flag is present. Also included hyperlink to tsunami warning center.
Feb 13, 2024: Updated feed logic to remove superseded events.
This map is provided for informational purposes and is not monitored 24/7 for accuracy and currency. Always refer to the USGS source for official guidance. If you would like to be alerted to potential issues or simply see when this service will update next, please visit our Live Feed Status Page!
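A minimal sketch of the cache-friendly query pattern described above, assuming a hypothetical feature layer URL and age field name (the real service's URL and schema are not reproduced here):

```python
# Filter on a relative "Age" field with a plain GET request instead of a
# non-cacheable Date/Time filter or a POST. The layer URL and the field names
# in the WHERE clause are placeholders, not the actual layer's values.
import requests

layer_url = "https://services.example.com/arcgis/rest/services/Earthquakes/FeatureServer/0"
params = {
    "where": "mag >= 4.5 AND eventTime_age <= 24",  # hypothetical age-in-hours field
    "outFields": "id,mag,place",
    "f": "json",
}
resp = requests.get(f"{layer_url}/query", params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json().get("features", []):
    print(feature["attributes"])
```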
This app is part of Indicators of the Planet. Please see https://livingatlas.arcgis.com/indicators
In addition to displaying earthquakes by magnitude, this service also provides earthquake impact details. Impact is measured by population as well as models for economic and fatality loss. For more details, see: PAGER Alerts.
Events are updated as frequently as every 5 minutes and are available up to 30 days with the following exceptions:
Events with a Magnitude LESS than 3.0 are retained for 3 days
Events with a Magnitude LESS than 4.5 are retained for 7 days
In addition to event points, ShakeMaps are also provided. These have been dissolved by Shake Intensity to reduce the layer complexity.
The specific layers provided in this service have been Time Enabled and include:
Events by Magnitude: The event’s seismic magnitude value. Contains PAGER Alert Level: USGS PAGER (Prompt Assessment of Global Earthquakes for Response) system provides an automated impact level assignment that estimates fatality and economic loss. Contains Significance Level: An event’s significance is determined by factors like magnitude, max MMI, ‘felt’ reports, and estimated impact.
Shake Intensity: The Instrumental Intensity or Modified Mercalli Intensity (MMI) for available events.
For field terms and technical details, see: ComCat Documentation.
This map is provided for informational purposes and is not monitored 24/7 for accuracy and currency. Always refer to the USGS source for official guidance.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The Biological Resources Inventory (BRI) consists of 17 related tables, covering biological groupings (fish, invertebrates, plants, terrestrial vertebrates and even habitats), properties, data sources, and survey types. For each biological grouping, there are two tables. One lists all the species within an organismal grouping expected to occur in California (or, in the case of habitats, all the habitat types expected to occur in California). The other links a given species (in its respective organismal grouping) or habitat from the first table with a property number and, in some cases, provides further information about such details as the abundance, season and survey type. As expected, there is also a table that links the property numbers used throughout the database with the actual property names. Our research indicates that this list of property names and numbers may not be up to date, nor reflect the names and property numbers used in the current lands inventory (more information about the Lands Inventory is available from Sharon Taylor, Lands Program - Sharon.Taylor@wildlife.ca.gov or 916-323-7194). There are also several supporting tables that provide information about interpreting the codes used for abundance, season and survey type. While there are many entries in the BRI, most lack information about who, how, when and why they were obtained. We know data was collected from many sources and ranges in quality from first-person direct observations made by CDFW personnel and partners, to regional bird and plant lists, to land management plans, to predicted occurrences from one of several nascent iterations of the California Wildlife Habitat Relationships program. Because of this variation, and our inability to understand the provenance or verify the accuracy of the data, we felt it prudent to include only the highest quality data from the BRI in the Biogeographic Information and Observation System (BIOS). This dataset is a subset of the information contained in the BRI database. It consists of direct observations made by CDFW staff or partners of fish, invertebrates, plants, terrestrial vertebrates and habitats from the BRI. While this is the best and likely most useful data contained in the BRI, we still have little supplemental information about the nature of these direct observations (date of observation, observer, reason for study or survey, etc.), and as such we urge the user to exercise caution in interpreting these data.
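As an illustration of how the grouping, occurrence, and property tables relate, here is a hypothetical sketch in Python; the file, table, and column names are illustrative only, since the actual BRI schema is not reproduced here.

```python
# Hypothetical sketch of the BRI structure: a species list per grouping, an
# occurrence table linking species to property numbers, and a property lookup.
import pandas as pd

fish_species = pd.read_csv("bri_fish_species.csv")          # species expected to occur in California
fish_occurrences = pd.read_csv("bri_fish_occurrences.csv")  # species_id, property_no, abundance, season, survey_type
properties = pd.read_csv("bri_properties.csv")              # property_no, property_name

occurrences = (
    fish_occurrences
    .merge(fish_species, on="species_id", how="left")
    .merge(properties, on="property_no", how="left")
)
print(occurrences[["property_name", "common_name", "abundance", "season"]].head())
```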
The assessment activity requires a student to prepare a layout and a written report for the CDEM on who needs to be evacuated from Tauranga, and where and how the evacuees would travel, to escape a tsunami expected to hit Tauranga. The layout will include maps/images that show where the effects of the tsunami will be felt as well as possible evacuation options.
The report will include:
the areas of Tauranga that are at risk of a tsunami and the reasons why they are at risk;
the tsunami safe zones and the problems associated with them;
an evacuation plan to explain how to evacuate people from risk areas.
This assessment activity requires a student to comprehensively apply spatial analysis, with direction, to solve a geographic problem (Achievement Standard 91014).
In addition to displaying earthquakes by magnitude, this service also provides earthquake impact details. Impact is measured by population as well as models for economic and fatality loss. For more details, see: PAGER Alerts. Events are updated as frequently as every 5 minutes and are available up to 30 days with the following exceptions:
Did You Feel It? (DYFI) collects information from people who felt an earthquake and creates maps that show what people experienced and the extent of damage.
This is a modeled dataset showing estimated heat index* on the afternoon (3-4pm CST) of July 27, 2024. It is based on the Urban Heat Island Afternoon Traverse point dataset and was created by CAPA Strategies to support the National Oceanic and Atmospheric Administration’s Urban Heat Island Mapping Campaign.
*Heat index is an approximation of how hot it feels when humidity is combined with air temperature. For this project, the heat index was calculated by combining the temperature with the corresponding relative humidity measurement using the equations advised by the National Weather Service (CAPA Strategies, 9/2024).
The information in this summary is taken from the report on the investigation:
In September 2007, the National Heritage Board, UV Öst, carried out an archaeological investigation, phase 2, within the properties Tägneby 3:4 and 4:6, Rystad parish, Linköping municipality. The investigation consisted of search trenches both in a field currently in use and partly within a former field area that is now planted with birch forest. The work was prompted by plans to build nine new house plots. On the western side, which is old arable land, undisturbed soil was found between 0.25 and 0.4 m below the present ground surface. No features, layers or finds were found in any of the trenches. On the eastern side, undisturbed soil was found between 0.4 and 0.7 m below the present ground surface. A black-brown layer was found in some of the trenches but was interpreted as being of natural origin. No features, structures or finds could be found.
Purpose:
The information on the purpose is taken from the report on the investigation:
The purpose of the archaeological investigation was to determine whether or not the areas in question contain ancient remains.
The zip file contains GIS files as well as an Access data file with information about the trenches, find objects, types of remains, and other metadata about the archaeological investigation.
This dataset contains data points along designated routes that captured near-surface air temperature on the morning (6-7am CST) of July 27, 2024. Traverse points were gathered by community volunteers and distributed by CAPA Strategies to support the National Oceanic and Atmospheric Administration’s Urban Heat Island Mapping Campaign.
t_f = temperature in Fahrenheit
rh = relative humidity
hi_f = heat index* in Fahrenheit
*Heat index is an approximation of how hot it feels when humidity is combined with air temperature. For this project, the heat index was calculated by combining the temperature with the corresponding relative humidity measurement using the equations advised by the National Weather Service (CAPA Strategies, 9/2024).
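For reference, the sketch below follows the standard National Weather Service heat index procedure (the Rothfusz regression with its adjustments) that the note cites; CAPA Strategies' exact implementation may differ.

```python
# NWS heat index procedure (Rothfusz regression); a reference sketch only.
import math

def heat_index_f(t_f: float, rh: float) -> float:
    """Heat index in deg F from air temperature (deg F) and relative humidity (%)."""
    # Simple formula, used when its average with the temperature is below 80 F.
    simple = 0.5 * (t_f + 61.0 + (t_f - 68.0) * 1.2 + rh * 0.094)
    if (simple + t_f) / 2.0 < 80.0:
        return simple
    # Full Rothfusz regression.
    hi = (-42.379 + 2.04901523 * t_f + 10.14333127 * rh
          - 0.22475541 * t_f * rh - 0.00683783 * t_f ** 2
          - 0.05481717 * rh ** 2 + 0.00122874 * t_f ** 2 * rh
          + 0.00085282 * t_f * rh ** 2 - 0.00000199 * t_f ** 2 * rh ** 2)
    # Adjustments for very dry or very humid conditions.
    if rh < 13 and 80 <= t_f <= 112:
        hi -= ((13 - rh) / 4) * math.sqrt((17 - abs(t_f - 95)) / 17)
    elif rh > 85 and 80 <= t_f <= 87:
        hi += ((rh - 85) / 10) * ((87 - t_f) / 5)
    return hi

print(round(heat_index_f(92.0, 60.0), 1))  # roughly 105 F
```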
Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0): https://creativecommons.org/licenses/by-nc-nd/4.0/
License information was derived automatically
Tsunami evacuation maps containing three zones (red, orange and yellow) corresponding to different threat levels, for the development of tsunami evacuation plans, public awareness, self-evacuation, and official civil defence emergency management or emergency services evacuations in the event of a tsunami. The zones were digitised onto orthophoto coverage based on raw polygon data derived from two tsunami modelling projects. The first of these was a regional modelling project undertaken in 2008 using an empirical 'rule-based' tsunami height attenuation modelling technique. The second project was a more detailed hydrodynamic inundation modelling of Wellington Harbour undertaken in 2015.
The regional modelling used an attenuation relationship to model the onshore and upstream runup of tsunami flow to derive the horizontal distances and vertical elevations. The relationship was based on and calibrated with observations of tsunami runup elevations from known events. The relationship is 0.5% height attenuation by distance (i.e. water gets 1.0 m shallower every 200 m inland). The relationship is different for flow over water and up rivers, where height attenuation was set at 0.25% (i.e. water gets 1.0 m shallower every 400 m upstream). Only rivers with a width at the mouth over 10 m were modelled in this way. Two zones were modelled with this attenuation relationship: the orange zone and the yellow zone. The red zone was derived by orthophoto analysis, complemented with LiDAR data. The input data for the GIS modelling consisted of a digital elevation model in ArcINFO grid format derived from LiDAR data coverage for the western half of the region (Hutt, Wellington, Porirua, Kapiti) and an interpolated DEM from the 1:50000 LINZ topographic NZMS-260 series for the Wairarapa Coast. Individual grids were generated for significant rivers and lakes to model the attenuation rule over water, and a sea polygon coverage was created from the coastline as an inundation source. The modelling was calibrated with probabilistic tsunami wave heights derived from previously recorded events that have affected the Wellington region and with modelling of tsunami from known source areas.
The second project was a more detailed hydrodynamic inundation modelling undertaken using the COMCOT tsunami model developed by GNS tsunami scientists. The area covered includes Wellington Harbour including Lower Hutt, Eastbourne and the east Harbour bays, Wellington City and the south coast bays (Lyall, Houghton, Island and Owhiro Bay). More detailed modelling was possible due to high-resolution topographic and bathymetric data becoming available for the area from projects co-funded by Greater Wellington Regional Council, advances in the understanding of tsunami flows within enclosed harbours, and research undertaken by GNS Science on the rupture characteristics of the Hikurangi subduction zone – the offshore plate boundary of the Australian and Pacific tectonic plates.
The red zone, also known as the shore exclusion zone, can encompass wave heights up to 1.2 m with a 1% annual exceedance probability (i.e. 100 yr return period) from all sources (local, regional and distant). The red zone is to be used when a tsunami forecast suggests a marine threat or a threat only to beaches and coastal infrastructure and facilities. The orange zone was defined using probabilistic wave heights with a 0.2% AEP (i.e. 500 yr return period) from regional (1-3 hr travel time away) and distant sources (>3 hr travel time away).
For Wellington Harbour this encompasses tsunami wave heights up to 5.0 m. The orange zone is to be used when a forecast tsunami from a distant source is expected to cause some inundation, but not large enough to require evacuating the yellow zone. The yellow zone was defined using probabilistic wave heights with a 0.04% AEP (i.e. 2500 yr return period) from all possible sources and corresponds to the maximum credible event for the region, and up to a 6000 yr return period for Wellington Harbour based on a Mw 9.0 earthquake on the Hikurangi Subduction Zone. The yellow zone is primarily for self-evacuation in the event of a strongly felt or long-duration earthquake, or when a forecast of a distant-source tsunami above a specific threat level is issued. Further information can be found in the reports:
Leonard, G.S., Power, W., Lukovic, B., Smith, W., Johnston, D. and Downes, G. (2008), Tsunami Evacuation Zones for Wellington and Horizons Regions Defined by a GIS-Calculated Attenuation Rule. GNS Science Report 2008/30, 22 p.
Mueller, C., Power, W.L. and Wang, X. (2015), Hydrodynamic Inundation Modelling and Delineation of Tsunami Evacuation Zones for Wellington Harbour. GNS Science Consultancy Report 2015/176, 30 p.
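As an illustration of the attenuation rule used in the 2008 regional modelling, here is a minimal sketch; it demonstrates the rule itself, not the GIS implementation used to produce the published zones.

```python
# The rule described above: tsunami flow depth decreases by 1.0 m per 200 m
# travelled over land (0.5%) and by 1.0 m per 400 m over water or up rivers
# (0.25%). Illustration only, not the published GIS workflow.
def attenuated_height(initial_height_m: float, distance_m: float, over_water: bool = False) -> float:
    rate_per_m = 0.0025 if over_water else 0.005  # metres of height lost per metre travelled
    return max(0.0, initial_height_m - rate_per_m * distance_m)

# A 5.0 m wave at the shoreline would attenuate to zero roughly 1,000 m inland over land.
for d in (0, 200, 500, 1000):
    print(d, "m inland:", attenuated_height(5.0, d), "m")
```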