Buildings are the foundation of any 3D city; they create a realistic visual context for understanding the built environment. This rule can help you quickly create 3D buildings from your existing 2D building footprint polygons. Create buildings for your whole city or for specific areas of interest, and use them as context around higher-detail buildings or proposed future developments. Already have existing 3D buildings? Check out the Textured Buildings from Mass by Building Type rule.

What you get
A Rule Package file named Building_FromFootprint_Textured_ByBuildingType.rpk. The rule works with a polygon layer.

Get started
In ArcGIS Pro:
- Use this rule to create Procedural Symbols, which are 3D symbols drawn on 2D features.
- Create 3D objects (Multipatch layer) for sharing on the web (a scripted sketch follows below).
- Share on the web via a Scene Layer.
In CityEngine:
- See the CityEngine File Navigator Help.

Parameters
Building Type:
- Eave_Height: Height from the ground to the eave; units controlled by the Units parameter.
- Floor_Height: Height of each floor; units controlled by the Units parameter.
- Roof_Form: Style of the building roof (Gable, Hip, Flat, Green).
- Roof_Height: Height from the eave to the top of the roof; units controlled by the Units parameter.
- Type: The use activity within the building; this helps in assigning appropriate building textures.
Display:
- Color_Override: Setting this to True lets you define a specific color using the Override_Color parameter and disables photo-texturing.
- Override_Color: Specifies a building color using the color palette. Note: you must change Color_Override from False to True for this parameter to take effect.
- Transparency: Sets the amount of transparency of the feature.
Units:
- Units: Controls the measurement units in the rule: Meters | Feet.

Important Note: You can hook the rule parameters up to attributes in your data by clicking the database icon to the right of each rule parameter. The database icon turns blue when the rule parameter is mapped to an attribute field. The rule connects automatically when field names match rule parameter names. Use layer files to preserve rule configurations unique to your data.

For those who want to know more
This rule is part of the 3D Rule Library available in the Living Atlas. Discover more 3D rules to help you perform your work. Learn more about ArcGIS Pro in the Getting to Know ArcGIS Pro lesson.
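If you prefer to script the footprint-to-multipatch conversion rather than apply the rule interactively as procedural symbology, the 3D Analyst tool Features From CityEngine Rules can apply the rule package to footprint polygons and write multipatch features. A minimal arcpy sketch, assuming placeholder feature class and output paths (these are not paths shipped with the rule package):

```python
import arcpy

arcpy.CheckOutExtension("3D")  # Features From CityEngine Rules is a 3D Analyst tool

footprints = r"C:\data\city.gdb\building_footprints"                     # placeholder input
rpk = r"C:\rules\Building_FromFootprint_Textured_ByBuildingType.rpk"     # the rule package
out_multipatch = r"C:\data\city.gdb\buildings_3d"                        # placeholder output

# Convert footprints to textured multipatch buildings using the rule package.
# An optional parameter of this tool controls whether existing attribute fields
# (e.g., Eave_Height, Roof_Form) are carried over to drive the rule; see the tool help.
arcpy.ddd.FeaturesFromCityEngineRules(footprints, rpk, out_multipatch)

arcpy.CheckInExtension("3D")
```

The resulting multipatch layer can then be shared to the web as a scene layer.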
Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0): https://creativecommons.org/licenses/by-nc-nd/4.0/
License information was derived automatically
This two-part video tutorial provides a comprehensive, step-by-step guide to creating and georeferencing a 3D textured model of a historic building using Reality Capture. It covers the entire process, from photo alignment and importing GPS data from a text file to identifying and using ground control points (GCPs) to improve the alignment of model components and accurately georeference the point cloud. The tutorial also demonstrates model creation, cleaning, simplification, and texturing. In the final steps, the model is exported and imported into ArcGIS Pro for geographic analysis and visualization.
Mature Support Notice: This item is in mature support as of December 2024. A new version of this item is available for your use. Esri recommends updating your maps and apps to use the new version; see the blog for more information.

This 3D scene layer presents OpenStreetMap (OSM) buildings data hosted by Esri. Esri created buildings and trees scene layers from the OSM Daylight map distribution, which is supported by Facebook and others. The Daylight map distribution has been sunsetted, and data updates supporting this layer are no longer available. You can visit openstreetmap.maps.arcgis.com to explore a collection of maps, scenes, and layers featuring OpenStreetMap data in ArcGIS. You can review the 3D Scene Layers Documentation to learn more about how the building and tree features in OSM are modeled and rendered in the 3D scene layers, and see the tagging recommendations to get the best results.

OpenStreetMap is an open collaborative project to create a free editable map of the world. Volunteers gather location data using GPS, local knowledge, and other free sources of information and upload it. The resulting free map can be viewed and downloaded from the OpenStreetMap site: www.OpenStreetMap.org. Esri is a supporter of the OSM project.

Note: This layer is supported in Scene Viewer and ArcGIS Pro 3.0 or higher.
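To locate and inspect hosted scene layers like this one programmatically, the ArcGIS API for Python can search public ArcGIS Online content. A minimal sketch, assuming the layer can be found by a title search; the query string is an assumption, so substitute the exact item ID of the layer you need:

```python
from arcgis.gis import GIS

# An anonymous connection is sufficient for public Esri-hosted content.
gis = GIS()

# Search public content for OSM 3D buildings scene services.
# The query text is a guess at the item title, not the exact item.
results = gis.content.search("OpenStreetMap 3D Buildings",
                             item_type="Scene Service",
                             outside_org=True)
for item in results[:5]:
    print(item.title, item.id)
```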
This dataset contains 50-ft contours for the Hot Springs shallowest unit of the Ouachita Mountains aquifer system potentiometric-surface map. The potentiometric surface shows the altitude at which the water level would have risen in tightly cased wells and represents synoptic conditions during the summer of 2017. Contours were constructed from 59 water-level measurements in selected wells (locations in the well point dataset), major streams and creeks selected in the study area from the USGS National Hydrography Dataset (U.S. Geological Survey, 2017), and the spring point dataset, with 18 spring altitudes calculated from 10-meter digital elevation model (DEM) data (U.S. Geological Survey, 2015; U.S. Geological Survey, 2016). After collecting, processing, and plotting the data, a potentiometric surface was generated using the Topo to Raster interpolation method in ArcMap 10.5 (Esri, 2017a). This tool is specifically designed for the creation of digital elevation models and imposes constraints that ensure a connected drainage structure and a correct representation of the surface from the provided contour data (Esri, 2017a). Once the raster surface was created, contours at a 50-ft interval were generated using the Contour tool (Spatial Analyst; also available through the ArcGIS 3D Analyst toolbox), which creates a line feature class of contours (isolines) from the raster surface (Esri, 2017b). The Topo to Raster interpolation and contouring in ArcMap 10.5 is a rapid way to interpolate data, but computer programs do not account for hydrologic connections between groundwater and surface water. For this reason, some contours were manually adjusted based on topographical influence, a comparison with the potentiometric surface of Kresse and Hays (2009), and data-point water-level altitudes to more accurately represent the potentiometric surface.

Select References:
Esri, 2017a, How Topo to Raster works—Help | ArcGIS Desktop, accessed December 5, 2017, at http://pro.arcgis.com/en/pro-app/tool-reference/3d-analyst/how-topo-to-raster-works.htm.
Esri, 2017b, Contour—Help | ArcGIS Desktop, accessed December 5, 2017, at http://pro.arcgis.com/en/pro-app/tool-reference/3d-analyst/contour.htm.
Kresse, T.M., and Hays, P.D., 2009, Geochemistry, Comparative Analysis, and Physical and Chemical Characteristics of the Thermal Waters East of Hot Springs National Park, Arkansas, 2006-09: U.S. Geological Survey 2009–5263, 48 p., accessed November 28, 2017, at https://pubs.usgs.gov/sir/2009/5263/.
U.S. Geological Survey, 2015, USGS NED 1 arc-second n35w094 1 x 1 degree ArcGrid 2015, accessed December 5, 2017, at The National Map: Elevation at https://nationalmap.gov/elevation.html.
U.S. Geological Survey, 2016, USGS NED 1 arc-second n35w093 1 x 1 degree ArcGrid 2016, accessed December 5, 2017, at The National Map: Elevation at https://nationalmap.gov/elevation.html.
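The interpolation-and-contouring step described above can also be reproduced in a script. A minimal arcpy sketch of a Topo to Raster surface built from well, spring, and stream inputs followed by 50-ft contouring; the dataset names, field names, and cell size below are placeholders, not the values used in the original study:

```python
import arcpy
from arcpy.sa import TopoToRaster, TopoPointElevation, TopoStream, Contour

arcpy.CheckOutExtension("Spatial")

# Placeholder inputs: well and spring points with a water-level/altitude field,
# plus NHD streams used to enforce a connected drainage structure.
inputs = [
    TopoPointElevation([["wells", "wl_alt_ft"], ["springs", "spring_alt_ft"]]),
    TopoStream(["nhd_streams"]),
]

# Interpolate the potentiometric surface (cell size is an assumption).
surface = TopoToRaster(inputs, cell_size=30)
surface.save(r"C:\work\potentiometric.tif")

# Generate 50-ft contours (isolines) from the interpolated surface.
Contour(surface, r"C:\work\potentiometric_contours.shp", 50)

arcpy.CheckInExtension("Spatial")
```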
This 3D model of Cape Cod structures and street trees was created with planimetrics from 2014. Using a combination of CityEngine, ArcGIS Pro, and purchased COLLADA models, the Cape Cod Commission generated this 3D scene to enhance visualization scenarios.
This deep learning model is used for extracting windows and doors in textured building data displayed in 3D views. Manually digitizing windows and doors from 3D building data can be a slow process. This model automates the extraction of these objects from a 3D view and can help speed up 3D editing and analysis workflows. Using this model, existing building data can be enhanced with additional information on the location, size, and orientation of windows and doors. The extracted windows and doors can be further used to perform 3D visibility analysis using existing 3D geoprocessing tools in ArcGIS.

This model can be useful in many industries and workflows. National government and state-level law enforcement could use this model in security analysis scenarios. Local governments could use window and door locations to help with tax assessments in CAMA (computer-aided mass appraisal), as well as impact studies for urban planning. Public safety users might be interested in physical or visual access to restricted areas, or the ability to build evacuation plans. The commercial sector, from real-estate agents to advertisers to office and interior designers, would benefit from knowing where windows and doors are located. Even utilities, especially mobile phone providers, could take advantage of knowing window sizes and positions. To be clear, this model doesn't solve these problems, but it does allow users to extract and collate some of the data they will need to do so.

Using the model
This model is generic and is expected to work well with a variety of building styles and shapes. To use this model, you need to install the supported deep learning frameworks packages first. See Install deep learning frameworks for ArcGIS for more information. The model can be used with the Interactive Object Detection tool. A blog on the ArcGIS Pro tool that leverages this model is published on Esri Blogs. We've also published steps on how to retrain this model further using your own data (a minimal retraining sketch follows below).

Input
The model is expected to work with any textured building data displayed in 3D views. Example data sources include textured multipatches, 3D object scene layers, and integrated mesh layers.

Output
Feature class with polygons representing the detected windows and doors in the input imagery.

Model architecture
The model uses the FasterRCNN model architecture implemented using the ArcGIS API for Python.

Training data
This model was trained using images from the Open Images Dataset.

Sample results
Below are sample results of the windows detected with this model in ArcGIS Pro using the Interactive Object Detection tool, which outputs the detected objects as a symbolized point feature class with size and orientation attributes.
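Retraining this model on your own labeled imagery uses the FasterRCNN class in the ArcGIS API for Python (arcgis.learn). A minimal sketch, assuming Pascal VOC-style training chips exported from your own data and placeholder paths; the epoch count, batch size, and file names are illustrative, not published values:

```python
from arcgis.learn import prepare_data, FasterRCNN

# Placeholder: folder of exported training chips (e.g., from
# Export Training Data For Deep Learning) with rectangle labels.
data = prepare_data(r"C:\training\windows_doors",
                    dataset_type="PASCAL_VOC_rectangles",
                    batch_size=8)

# Load the pretrained model definition (path is a placeholder)
# and continue training on the new data.
model = FasterRCNN.from_model(r"C:\models\WindowsDoors.dlpk", data)
model.fit(epochs=10)

# Save a new model package that can then be used with the
# Interactive Object Detection tool in ArcGIS Pro.
model.save("windows_doors_finetuned")
```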
Our Co-design team is from the University of Texas, working on a Department of Energy-funded project focused on the Beaumont-Port Arthur area. As part of this project, we will be developing climate-resilient design solutions for areas of the region. More on www.caee.utexas.edu.

We captured aerial photos in the Port Arthur Coastal Neighborhood Community and the Golf Course on Pleasure Island, Texas, in June 2024. Aerial photos were taken through DroneDeploy autonomous flight, and models were processed through the DroneDeploy engine as well. All aerial photos are in .JPG format and contained in zipped files for each area.

The processed data package includes 3D models, geospatial data, mappings, and point clouds. Please be aware that the DTM, elevation toolbox, point cloud, and orthomosaic use EPSG:6588, while the 3D model uses EPSG:3857 (see the reprojection sketch below).

For using these data:
- The Adobe Suite gives you software to open .TIF files.
- You can use LASUtility (Windows), ESRI ArcGIS Pro (Windows), or Blaze3D (Windows, Linux) to open a LAS file and view the data it contains.
- An .OBJ file opens in a large number of free and commercial applications, including Microsoft 3D Builder, Apple Preview, Blender, and Autodesk products.
- You may use ArcGIS, Merkaartor, Blender (with the Google Earth Importer plug-in), Global Mapper, and Marble to open .KML files.
- The .tfw world file is a text file used to georeference the GeoTIFF raster images, like the orthomosaic and the DSM. You need suitable software, like ArcView, to open a .TFW file.

This dataset provides researchers with sufficient geometric data and the status quo of the land surface at the locations mentioned above. This dataset could streamline researchers' decision-making processes and enhance the design as well.
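Because the rasters and point cloud ship in EPSG:6588 while the 3D model uses EPSG:3857, it can help to bring everything into one coordinate system before analysis. A minimal arcpy sketch, assuming placeholder file names; the target coordinate system choice and the hypothetical model-footprint shapefile are assumptions to adapt to your own extracts:

```python
import arcpy

# Placeholder inputs from the extracted data package.
ortho = r"C:\data\port_arthur\orthomosaic.tif"   # delivered in EPSG:6588
dtm = r"C:\data\port_arthur\dtm.tif"             # delivered in EPSG:6588

# Check which spatial reference each raster actually carries.
for raster in (ortho, dtm):
    sr = arcpy.Describe(raster).spatialReference
    print(raster, sr.name, sr.factoryCode)

# Reproject a web-Mercator (EPSG:3857) vector layer into EPSG:6588 so it
# lines up with the rasters; the input path is a hypothetical placeholder.
arcpy.management.Project(r"C:\data\port_arthur\model_footprint.shp",
                         r"C:\data\port_arthur\model_footprint_6588.shp",
                         arcpy.SpatialReference(6588))
```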
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This New Zealand Point Cloud Classification Deep Learning Package will classify point clouds into tree and background classes. This model is optimized to work with New Zealand aerial LiDAR data.The classification of point cloud datasets to identify Trees is useful in applications such as high-quality 3D basemap creation, urban planning, forestry workflows, and planning climate change response.Trees could have a complex irregular geometrical structure that is hard to capture using traditional means. Deep learning models are highly capable of learning these complex structures and giving superior results.This model is designed to extract Tree in both urban and rural area in New Zealand.The Training/Testing/Validation dataset are taken within New Zealand resulting of a high reliability to recognize the pattern of NZ common building architecture.Licensing requirementsArcGIS Desktop - ArcGIS 3D Analyst extension for ArcGIS ProUsing the modelThe model can be used in ArcGIS Pro's Classify Point Cloud Using Trained Model tool. Before using this model, ensure that the supported deep learning frameworks libraries are installed. For more details, check Deep Learning Libraries Installer for ArcGIS.Note: Deep learning is computationally intensive, and a powerful GPU is recommended to process large datasets.InputThe model is trained with classified LiDAR that follows the LINZ base specification. The input data should be similar to this specification.Note: The model is dependent on additional attributes such as Intensity, Number of Returns, etc, similar to the LINZ base specification. This model is trained to work on classified and unclassified point clouds that are in a projected coordinate system, in which the units of X, Y and Z are based on the metric system of measurement. If the dataset is in degrees or feet, it needs to be re-projected accordingly. The model was trained using a training dataset with the full set of points. Therefore, it is important to make the full set of points available to the neural network while predicting - allowing it to better discriminate points of 'class of interest' versus background points. It is recommended to use 'selective/target classification' and 'class preservation' functionalities during prediction to have better control over the classification and scenarios with false positives.The model was trained on airborne lidar datasets and is expected to perform best with similar datasets. Classification of terrestrial point cloud datasets may work but has not been validated. For such cases, this pre-trained model may be fine-tuned to save on cost, time, and compute resources while improving accuracy. Another example where fine-tuning this model can be useful is when the object of interest is tram wires, railway wires, etc. which are geometrically similar to electricity wires. When fine-tuning this model, the target training data characteristics such as class structure, maximum number of points per block and extra attributes should match those of the data originally used for training this model (see Training data section below).OutputThe model will classify the point cloud into the following classes with their meaning as defined by the American Society for Photogrammetry and Remote Sensing (ASPRS) described below: 0 Background 5 Trees / High-vegetationApplicable geographiesThe model is expected to work well in the New Zealand. It's seen to produce favorable results as shown in many regions. 
However, results can vary for datasets that are statistically dissimilar to the training data.
- Training dataset: Wellington City
- Testing dataset: Tawa City
- Validation/Evaluation dataset: Christchurch City

Model architecture
This model uses the PointCNN model architecture implemented in the ArcGIS API for Python.

Accuracy metrics
The table below summarizes the accuracy of the predictions on the validation dataset.
Class | Precision | Recall | F1-score
Never Classified | 0.991200 | 0.975404 | 0.983239
High Vegetation | 0.933569 | 0.975559 | 0.954102

Training data
This model was trained on a classified dataset originally provided by OpenTopography, with less than 1% manual labelling and correction. Train-test split: {Train: 80%, Test: 20%}; this ratio was chosen based on analysis of previous epoch statistics, which showed a decent improvement.
The training data used has the following characteristics:
- X, Y, and Z linear unit: meter
- Z range: -121.69 m to 26.84 m
- Number of returns: 1 to 5
- Intensity: 16 to 65520
- Point spacing: 0.2 ± 0.1
- Scan angle: -15 to +15
- Maximum points per block: 8192
- Block size: 20 meters
- Class structure: [0, 5]

Sample results
Results from classifying a dataset with 5 pts/m density (Christchurch city dataset). The model's performance is directly proportional to the dataset's point density and how well noise is excluded from the point clouds. To learn how to use this model, see this story.
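For a scripted run of the Classify Point Cloud Using Trained Model geoprocessing tool mentioned above, a minimal arcpy sketch follows; the LAS dataset path and model package path are placeholders, and the third argument (which model class to apply) is written from the tool reference as remembered, so check the tool help before relying on it:

```python
import arcpy

arcpy.CheckOutExtension("3D")  # the tool ships with the 3D Analyst extension

las_dataset = r"C:\lidar\nz_tiles.lasd"             # placeholder LAS dataset
model_pkg = r"C:\models\NZTreeClassification.dlpk"  # placeholder model package

# Classify the point cloud in place, applying the model's tree class (5).
# Optional parameters of this tool control how existing class codes are
# handled (selective classification / class preservation); see the tool help.
arcpy.ddd.ClassifyPointCloudUsingTrainedModel(las_dataset, model_pkg, "5")

arcpy.CheckInExtension("3D")
```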
New to 3D GIS with @esri and @ArcGISPro? I've put together a printable 2-page cheat-sheet of the concepts and terms you need to get started! 😊

First, take a quick anatomy lesson to learn about the elements that make up a 3D scene. Not every scene has every element, but you need to know your options.

Perhaps most importantly, think about HOW you intend to share your 3D map BEFORE you spend hours (or days) making it.
- Tip: If it's an image or a video, you can spend more time on areas you know the camera will visit, and less on the rest.

3D also has this nasty habit of showing continuous scales (aka levels of detail / LODs) throughout the view. You WILL need to think about how scales change off into the distance, as well as choosing the "just-right" LOD for the features you're showing.

Got data with no Z's? No problem – give them a place to draw by creating an elevation surface. While 'Ground' is the most famous, you can also model surfaces underground, in the air, and based on thematic values.

"Paint" your surfaces by draping them with imagery and cartographic content... but don't forget about the whole continuous-scale thing.

Vector features are the "pretty boys" of your 3D map. They give the scene depth and things for people to click on. They can also be high maintenance, both in creation time and performance impact. Use them wisely, and consistently, to avoid a scene that tries to do too much.

Vector objects come in all shapes and sizes. Think about how even a simple shape – rotated and resized into place – can communicate information to the user. Everything does NOT* have to look "real"!
* (Full disclosure: sometimes it does.)

Much like the Ground surface, the exterior shell of vector objects can also be "painted". The source could be oblique imagery... or procedurally placed windows and bricks... or even "material" properties that can make a surface appear to be iron or glass or wood.

Text in a 3D view can label locations and reinforce the direction a feature is oriented. Make it 3D (where you can) and only drape it on the ground as a last resort... or when you have full camera control (e.g., video).

Once you've symbolized all your layers, you still have more to do – you must also think about the scene as a whole! A scene's light direction can change the mood, exaggeration can make flat land interesting, and the background color might be critical for the intended use of an exported image.

And, finally, if you're sharing an interactive 3D view, please have a little EMPATHY for your audience. Some of them will be new to 3D and – let's be honest – navigating around can be hard. If you give them safe places (bookmarks / slides) to zoom to if they get lost, they will love your work even more.

Hope this helps – good luck with your 3D!
-Nathan.
This deep learning model is used to detect trees in low-resolution drone or aerial imagery. Tree detection can be used for applications such as vegetation management, forestry, and urban planning. High-resolution aerial and drone imagery can be used for tree detection due to its high spatio-temporal coverage.
This deep learning model is based on MaskRCNN and has been trained on data from the DM Dataset, preprocessed and collected by the IST Team.
High-resolution imagery is not required: you can perform your analysis on low-resolution imagery, detect trees with an accuracy of about 75%, and fine-tune the model on your own data to improve performance.
Licensing requirements
- ArcGIS Desktop: ArcGIS Image Analyst and ArcGIS 3D Analyst extensions for ArcGIS Pro
- ArcGIS Enterprise: ArcGIS Image Server with raster analytics configured
- ArcGIS Online: ArcGIS Image for ArcGIS Online

Using the model
Follow the guide to use the model (a scripted sketch follows after this entry). Before using this model, ensure that the supported deep learning libraries are installed. For more details, check the Deep Learning Libraries Installer for ArcGIS.
Note: Deep learning is computationally intensive, and a powerful GPU is recommended to process large datasets.

Input
3-band, low-resolution (70 cm) satellite imagery.

Output
Feature class containing the detected trees.

Applicable geographies
The model is expected to work well in the U.A.E.

Model architecture
This model is based on the MaskRCNN architecture with a ResNet-152 backbone, implemented in PyTorch.

Training data
This model was trained on satellite imagery created and labelled by the team and validated on additional, more diverse locations.

Accuracy metrics
This model has an average precision score of 0.45.

Sample results
Here are a few results from the model.
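To run this tree detector over imagery with standard geoprocessing tooling, the Image Analyst tool Detect Objects Using Deep Learning can consume the model package. A minimal arcpy sketch, assuming placeholder raster, model, and output paths; the argument names and values in the arguments string depend on the model definition and are illustrative only:

```python
import arcpy

arcpy.CheckOutExtension("ImageAnalyst")

imagery = r"C:\imagery\aoi_70cm.tif"            # placeholder 3-band raster
model_pkg = r"C:\models\TreeDetection.dlpk"     # placeholder model package
out_trees = r"C:\data\results.gdb\detected_trees"

# Run detection; model arguments are passed as semicolon-separated
# "name value" pairs and should match the model definition.
arcpy.ia.DetectObjectsUsingDeepLearning(
    imagery,
    out_trees,
    model_pkg,
    "padding 56;threshold 0.5;batch_size 4",
)

arcpy.CheckInExtension("ImageAnalyst")
```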
Attribution-NonCommercial-ShareAlike 3.0 (CC BY-NC-SA 3.0): https://creativecommons.org/licenses/by-nc-sa/3.0/
License information was derived automatically
Focus on Geodatabases in ArcGIS Pro introduces readers to the geodatabase, the comprehensive information model for representing and managing geographic information across the ArcGIS platform. Sharing best practices for creating and maintaining data integrity, chapter topics include the careful design of a geodatabase schema, building geodatabases that include data integrity rules, populating geodatabases with existing data, working with topologies, editing data using various techniques, building 3D views, and sharing data on the web. Each chapter includes important concepts with hands-on, step-by-step tutorials, sample projects and datasets, 'Your turn' segments with less instruction, study questions for classroom use, and an independent project. Instructor resources are available by request.

AUDIENCE
Professional and scholarly.

AUTHOR BIO
David W. Allen has been working in the GIS field for over 35 years, the last 30 with the City of Euless, Texas, and has seen many versions of ArcInfo and ArcGIS come along since he started with version 5. He spent 18 years as an adjunct professor at Tarrant County College in Fort Worth, Texas, and now serves as the State Director of Operations for a volunteer emergency response group, developing databases and templates. Mr. Allen is the author of GIS Tutorial 2: Spatial Analysis Workbook (Esri Press, 2016).

Pub Date: Print: 6/17/2019; Digital: 4/29/2019
Format: Paperback
ISBN: Print: 9781589484450; Digital: 9781589484467
Trim: 7.5 x 9.25 in.
Price: Print: $59.99 USD; Digital: $59.99 USD
Pages: 260
This 3D model of Mount Saint Helens shows the topography using wood-textured contours set at 50 m vertical spacing, with the darker wood-grain color indicating the major contours at 1000, 1500, 2000, and 2500 meters above sea level. The state of the mountain before the eruption of May 18, 1980 is shown with thinner contours, allowing you to see the volume of rock that was ejected by the lateral blast.

The process to create the contours uses CityEngine and ArcGIS Pro for data processing, symbolization, and publishing. The steps (a scripted sketch of the geoprocessing steps follows below):
1. Create a rectangular AOI polygon and use the Clip Raster tool on your local terrain raster. A 30 m DEM was used for before, 10 m for after.
2. Run the Contour tool on the clipped raster, using the polygon output option; 50 m was used for this scene.
3. Run the Smooth Polygon tool on the contours. For Mount St. Helens, I used the PAEK algorithm with a 200 m smoothing tolerance. Depending on the resolution of the elevation raster and the extent of the AOI, a larger or smaller value may be needed.
4. Write a CityEngine rule (see below) that extrudes and textures each contour polygon to create a stair-stepped 3D contour map. Provide multiple wood texture options with parameters for grain size, grain rotation, extrusion height (to account for different contour depths if values other than 100 m are used), and a hook for the rule to read the ContourMax attribute that is created by the Contour tool. Export the CityEngine rule as a Rule Package (*.rpk).
5. Add some extra features for context: a wooden planter box to hide some of the edges of the model, and water bodies.
6. Apply the CityEngine-authored RPK to the contour polygons in ArcGIS Pro as a procedural fill symbol, and adjust parameters for the desired look and feel.
7. Run the Layer 3D to Feature Class tool to convert the procedural fill to multipatch features.
8. Share as a Web Scene.

Rather than create a more complicated CityEngine rule that applied textures for light/dark wood colors for minor/major contours, I just created a complete light-wood and dark-wood version of the mountain (and one with just the water), then shuffled them together. Depending on where this methodology is applied, you may want to clip out other areas, for example glaciers, roads, or rivers, or add annotation by inlaying a small north arrow in the corner of the map. I like the challenge of representing any feature in this scene in terms of wood colors and grains: some extruded, some recessed, some inlaid flat.
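The geoprocessing half of those steps (clip, contour to polygons, smooth, and bake the procedural symbology to multipatch) can be scripted. A minimal arcpy sketch, assuming placeholder paths, a placeholder AOI extent string, the 50 m interval and 200 m tolerance described above, and a hypothetical layer name for the symbolized contours; the polygon-output keyword of the Contour tool is written from memory, so verify it against the tool reference:

```python
import arcpy
from arcpy.sa import Contour

arcpy.CheckOutExtension("Spatial")
arcpy.CheckOutExtension("3D")

dem = r"C:\data\mt_st_helens\dem_10m.tif"    # placeholder terrain raster
gdb = r"C:\data\mt_st_helens\contours.gdb"   # placeholder workspace

# Step 1: clip the terrain raster to a rectangular AOI (extent values are placeholders).
arcpy.management.Clip(dem, "555000 5110000 575000 5130000", gdb + r"\dem_clip")

# Step 2: contour the clipped raster at 50 m, writing polygons instead of lines.
Contour(gdb + r"\dem_clip", gdb + r"\contours_50m", 50,
        contour_type="CONTOUR_POLYGON")

# Step 3: smooth the contour polygons with PAEK and a 200 m tolerance.
arcpy.cartography.SmoothPolygon(gdb + r"\contours_50m",
                                gdb + r"\contours_50m_smooth",
                                "PAEK", "200 Meters")

# Step 7: after applying the RPK as a procedural fill symbol to a map layer,
# convert the symbolized layer to multipatch features for sharing.
arcpy.ddd.Layer3DToFeatureClass("contours_50m_smooth_layer",   # hypothetical layer name
                                gdb + r"\contours_multipatch")
```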
The Southeast Texas Urban Integrated Field Lab's Co-design team captured aerial photos in the Port Arthur Coastal Neighborhood Community and the Golf Course on Pleasure Island, Texas, in June 2024. Aerial photos were taken through autonomous flight, and models were processed through the DroneDeploy engine. All aerial photos are in .JPG format and contained in zipped files for each area.

The processed data package includes 3D models, geospatial data, mappings, and point clouds. Please be aware that the DTM, elevation toolbox, point cloud, and orthomosaic use EPSG:6588, while the 3D model uses EPSG:3857.

For using these data:
- The Adobe Suite gives you software to open .TIF files.
- You can use LASUtility (Windows), ESRI ArcGIS Pro (Windows), or Blaze3D (Windows, Linux) to open a LAS file and view the data it contains.
- An .OBJ file opens in a large number of free and commercial applications, including Microsoft 3D Builder, Apple Preview, Blender, and Autodesk products.
- You may use ArcGIS, Merkaartor, Blender (with the Google Earth Importer plug-in), Global Mapper, and Marble to open .KML files.
- The .tfw world file is a text file used to georeference the GeoTIFF raster images, like the orthomosaic and the DSM. You need suitable software, like ArcView, to open a .TFW file.

This dataset provides researchers with sufficient geometric data and the status quo of the land surface at the locations mentioned above. This dataset will support researchers' decision-making processes under uncertainties.
This web scene was created to show climbing routes, climbing anchors, rappel routes, and approach trails in Devils Tower National Monument (DETO). It was created in ArcGIS Pro Scene 2.2, using an integrated approach combining literature review and collaboration with technical climbing staff. Routes were heads-up digitized over a 3D model of Devils Tower with derived triangulated irregular network (TIN) surfaces created from the DETO Unmanned Aircraft Systems (UAV) Digital Elevation Model (DEM) (a sketch of the DEM-to-TIN step follows below). This dataset utilizes the updated NPS Core Spatial Data Standard Implementation Plan Template dated August 31, 2016. Description last updated November 2018. These data are a representation and are for visual display only. For more information, please visit the Devils Tower National Monument Climbing web page. IRMA Reference
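The DEM-to-TIN derivation mentioned above can be reproduced with the 3D Analyst Raster To TIN tool. A minimal arcpy sketch, assuming a placeholder DEM path, output location, and z-tolerance value:

```python
import arcpy

arcpy.CheckOutExtension("3D")

dem = r"C:\data\deto\uas_dem.tif"    # placeholder UAS-derived DEM
tin = r"C:\data\deto\tower_tin"      # output TIN

# Convert the raster surface to a TIN; the z-tolerance (in DEM z-units)
# controls how closely the TIN honors the raster and is an assumption here.
arcpy.ddd.RasterTin(dem, tin, z_tolerance=1)

arcpy.CheckInExtension("3D")
```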
Our Co-design team is from the University of Texas, working on a Department of Energy-funded project focused on the Beaumont-Port Arthur area. As part of this project, we will be developing climate-resilient design solutions for areas of the region. More on www.caee.utexas.edu.

We used a DJI Mavic 2 Pro to capture aerial photos in Beaumont-Port Arthur, TX, in February 2023, including:
I. Beaumont Soccer Club
II. Corps' Port Arthur Resident Office
III. Halbouty Pump Station and its vicinity
IV. Lamar University (including the Exxon power plants close to Lamar University)
V. MLK Boulevard, for aerial images of the industry and the ship channel
VI. Salt Water Barrier (including some aerial images of the Big Thicket)

Aerial photos were taken through DroneDeploy autonomous flight, and models were processed through the DroneDeploy engine as well. All aerial photos are in .JPG format and contained in zipped files for each location.

The processed data package, including 3D models, geospatial data, mappings, point clouds, and the animation video of the Halbouty Pump Station, has various file types:
- The Adobe Suite gives you software to open .TIF files.
- You can use LASUtility (Windows), ESRI ArcGIS Pro (Windows), or Blaze3D (Windows, Linux) to open a LAS file and view the data it contains.
- An .OBJ file opens in a large number of free and commercial applications, including Microsoft 3D Builder, Apple Preview, Blender, and Autodesk products.
- You may use ArcGIS, Merkaartor, Blender (with the Google Earth Importer plug-in), Global Mapper, and Marble to open .KML files.
- The .tfw world file is a text file used to georeference the GeoTIFF raster images, like the orthomosaic and the DSM. You need suitable software, like ArcView, to open a .TFW file.

This dataset provides researchers with sufficient geometric data and the status quo of the land surface at the locations mentioned above. This dataset could streamline researchers' decision-making processes and enhance the design as well.

In October 2023, we had our follow-up data collection, including:
I. Beaumont Soccer Club
II. Shipping and Receiving Center at Lamar University
After the aerial collection, we obtained aerial photos of those two locations mentioned above, as well as processed data (such as point clouds and models).
The Building Information Model is of the Business and Public Management Center at West Chester University. We created it using AutoCAD drawings imported into ArcGIS Pro 2.2, and from there we created a 3D representation of the building. Created by Eric Quinn and Tyler LaMantia; supervising professor: Dr. Gary Coutu, West Chester University.
This downloadable zip file contains an ESRI File Geodatabase (FGDB) that is compatible with most versions of ArcGIS Pro, ArcMap, and AutoCAD Map 3D or Civil 3D. To view the geodatabase's contents, please download the zip file to a local directory and extract its contents. This zipped geodatabase will require approximately 1.38 GB of disc space (1.49 GB extracted). Due to its size, the zip file may take some time to download.

This downloadable file geodatabase (FGDB) includes topographic contours and spot elevations derived from LiDAR collected in spring of 2024 by Dewberry Engineers in coordination with Tallahassee - Leon County GIS. The contours were extracted at a 2-foot interval with index contours every 10 feet.

Lidar Acquisition Executive Summary
The primary purpose of this project was to develop a consistent and accurate surface elevation dataset derived from high-accuracy Light Detection and Ranging (lidar) technology for the Tallahassee Leon County project area. The lidar data were processed and classified according to project specifications. Detailed breaklines and bare-earth Digital Elevation Models (DEMs) were produced for the project area. Data was formatted according to tiles, with each tile covering an area of 5000 ft by 5000 ft. A total of 876 tiles were produced for the project, encompassing an area of approximately 785.55 sq. miles. The dataset was created by TLCGIS from lidar data acquired with a Riegl CQ-1560i lidar system from January 14, 2024 through January 19, 2024.

ORIGINAL COORDINATE REFERENCE SYSTEM
Data produced for the project were delivered in the following reference system.
Horizontal Datum: North American Datum of 1983 with the 2011 Adjustment (NAD 83 (2011))
Vertical Datum: North American Vertical Datum of 1988 (NAVD88)
Coordinate System: NAD83 (2011) State Plane Florida North (US survey feet)
Units: Horizontal units are in U.S. Survey Feet; vertical units are in U.S. Survey Feet.
Geoid Model: Geoid12B was used to convert ellipsoid heights to orthometric heights.
The polyline dataset currently represents 241 climbing routes at Devils Tower National Monument. It was created in ArcGIS Pro Scene 3.1, using an integrated approach combining literature review and collaboration with technical climbing staff. Routes were heads-up digitized over a 3D model of Devils Tower with derived triangulated irregular network (TIN) surfaces created from the DETO Unmanned Aircraft Systems (UAV) Digital Elevation Model (DEM). This dataset utilizes the updated NPS Core Spatial Data Standard Implementation Plan Template dated August 31, 2016. Description last updated January 2020. IRMA Reference
The point dataset represents 112 climbing anchors at Devils Tower National Monument. It was created in ArcGIS Pro Scene 3.1, using an integrated approach combining literature review and collaboration with technical climbing staff. Routes were heads-up digitized over a 3D model of Devils Tower with derived triangulated irregular network (TIN) surfaces created from the DETO Unmanned Aircraft Systems (UAV) Digital Elevation Model (DEM). This dataset utilizes the updated NPS Core Spatial Data Standard Implementation Plan Template dated August 31, 2016. Description last updated January 2020. IRMA Reference
The polyline dataset currently represents 5 rappel routes at Devils Tower National Monument. It was created in ArcGIS Pro Scene 3.1, using an integrated approach combining literature review and collaboration with technical climbing staff. Routes were heads-up digitized over a 3D model of Devils Tower with derived triangulated irregular network (TIN) surfaces created from the DETO Unmanned Aircraft Systems (UAV) Digital Elevation Model (DEM). This dataset utilizes the updated NPS Core Spatial Data Standard Implementation Plan Template dated August 31, 2016. Description last updated January 2020. IRMA Reference