Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
License information was derived automatically
This dataset is designed to aid in the analysis and detection of phishing websites. It contains various features that help distinguish between legitimate and phishing websites based on their structural, security, and behavioral attributes.
Result – Indicates whether a website is phishing or legitimate.
Prefix_Suffix – Checks if the URL contains a hyphen (-), which is commonly used in phishing domains.
double_slash_redirecting – Detects if the URL redirects using //, which may indicate a phishing attempt.
having_At_Symbol – Identifies the presence of @ in the URL, which can be used to deceive users.
Shortining_Service – Indicates whether the URL uses a shortening service (e.g., bit.ly, tinyurl).
URL_Length – Measures the length of the URL; phishing URLs tend to be longer.
having_IP_Address – Checks if an IP address is used in place of a domain name, which is suspicious.
having_Sub_Domain – Evaluates the number of subdomains; phishing sites often have excessive subdomains.
SSLfinal_State – Indicates whether the website has a valid SSL certificate (secure connection).
Domain_registeration_length – Measures the duration of domain registration; phishing sites often have short lifespans.
age_of_domain – The age of the domain in days; older domains are usually more trustworthy.
DNSRecord – Checks if the domain has valid DNS records; phishing domains may lack these.
Favicon – Determines if the website uses an external favicon (which can be a sign of phishing).
port – Identifies if the site is using suspicious or non-standard ports.
HTTPS_token – Checks if "HTTPS" is included in the URL but is used deceptively.
Request_URL – Measures the percentage of external resources loaded from different domains.
URL_of_Anchor – Analyzes anchor tags (<a> links) and their trustworthiness.
Links_in_tags – Examines <meta>, <script>, and <link> tags for external links.
SFH (Server Form Handler) – Determines if form actions are handled suspiciously.
Submitting_to_email – Checks if forms submit data directly to an email instead of a web server.
Abnormal_URL – Identifies if the website’s URL structure is inconsistent with common patterns.
Redirect – Counts the number of redirects; phishing websites may have excessive redirects.
on_mouseover – Checks if the website changes content on hover (used in deceptive techniques).
RightClick – Detects if right-click functionality is disabled (phishing sites may disable it).
popUpWindow – Identifies the presence of pop-ups, which can be used to trick users.
Iframe – Checks if the website uses <iframe> tags, often used in phishing attacks.
web_traffic – Measures the website’s Alexa ranking; phishing sites tend to have low traffic.
Page_Rank – Google PageRank score; phishing sites usually have a low PageRank.
Google_Index – Checks if the website is indexed by Google (phishing sites may not be indexed).
Links_pointing_to_page – Counts the number of backlinks pointing to the website.
Statistical_report – Uses external sources to verify if the website has been reported for phishing.
Result – The classification label (1: Legitimate, -1: Phishing).
This dataset is valuable for:
✅ Machine Learning Models – Developing classifiers for phishing detection.
✅ Cybersecurity Research – Understanding patterns in phishing attacks.
✅ Browser Security Extensions – Enhancing anti-phishing tools.
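As an illustration of how a few of the URL-based features above can be computed, here is a minimal Python sketch. The thresholds and regular expressions are simplifying assumptions for illustration, not the dataset's exact extraction rules; it uses the dataset's 1/-1 encoding (1 = legitimate-looking, -1 = phishing-looking).

```python
import re
from urllib.parse import urlparse

def url_features(url: str) -> dict:
    """Compute a handful of the listed features for one URL (illustrative only)."""
    host = urlparse(url).netloc
    return {
        # having_IP_Address: an IP in place of a domain name is suspicious
        "having_IP_Address": -1 if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host) else 1,
        # URL_Length: phishing URLs tend to be longer (the 54-char cutoff is an assumption)
        "URL_Length": 1 if len(url) < 54 else -1,
        # having_At_Symbol: '@' can hide the real destination from the user
        "having_At_Symbol": -1 if "@" in url else 1,
        # Prefix_Suffix: a hyphen in the domain is common in phishing domains
        "Prefix_Suffix": -1 if "-" in host else 1,
        # double_slash_redirecting: '//' appearing after the scheme suggests a redirect
        "double_slash_redirecting": -1 if url.rfind("//") > 7 else 1,
    }

print(url_features("http://192.168.0.1/login"))
```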
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Analysis of ‘Popular Website Traffic Over Time’ provided by Analyst-2 (analyst-2.ai), based on source dataset retrieved from https://www.kaggle.com/yamqwe/popular-website-traffice on 13 February 2022.
--- Dataset description provided by original source is as follows ---
Background
Have you ever been in a conversation where the question comes up: who uses Bing? This question comes up occasionally because people wonder whether these sites get any views. In this research study, we explore the traffic of many popular websites.
Methodology
The data collected originates from SimilarWeb.com.
Source
For the analysis and study, go to The Concept Center
This dataset was created by Chase Willden and contains monthly traffic columns (1/1/2017, 12/1/2016, 3/1/2017, and more) along with Social Media, technical information, and other features.
- Analyze 11/1/2016 in relation to 2/1/2017
- Study the influence of 4/1/2017 on 1/1/2017
- More datasets
If you use this dataset in your research, please credit Chase Willden
--- Original source retains full ownership of the source dataset ---
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Code:
Packet_Features_Generator.py & Features.py
To run this code:
pkt_features.py [-h] -i TXTFILE [-x X] [-y Y] [-z Z] [-ml] [-s S] -j
-h, --help   show this help message and exit
-i TXTFILE   input text file
-x X         Add first X number of total packets as features.
-y Y         Add first Y number of negative packets as features.
-z Z         Add first Z number of positive packets as features.
-ml          Output to text file all websites in the format of websiteNumber1,feature1,feature2,...
-s S         Generate samples using size s.
-j
Purpose:
Turns a text file containing lists of incoming and outgoing network packet sizes into separate website objects with associated features.
Uses Features.py to calculate the features.
startMachineLearning.sh & machineLearning.py
To run this code:
bash startMachineLearning.sh
This code then runs machineLearning.py in a tmux session with the necessary file paths and flags.
Options (to be edited within this file):
--evaluate-only to test 5 fold cross validation accuracy
--test-scaling-normalization to test 6 different combinations of scalers and normalizers
Note: once the best combination is determined, it should be added to the data_preprocessing function in machineLearning.py for future use
--grid-search to search for the best grid search hyperparameters. Note: the candidate hyperparameters must be added to train_model under 'if not evaluateOnly:'; once the best hyperparameters are determined, add them to train_model under 'if evaluateOnly:'
Purpose:
Using the .ml file generated by Packet_Features_Generator.py & Features.py, this program trains a RandomForest classifier on the provided data and reports results using cross validation. These results include the best scaling and normalization options for each data set as well as the best grid search hyperparameters based on the provided ranges.
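The evaluation flow described above can be sketched with scikit-learn. This is an illustrative stand-in, not the repository's machineLearning.py: the data is randomly generated, and only three scaler/normalizer combinations are shown (the --test-scaling-normalization option tests six).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler, Normalizer

# Hypothetical stand-in for the .ml file: rows of websiteNumber,feature1,feature2,...
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = rng.integers(0, 3, size=60)

# Try each preprocessing option and report 5-fold cross-validation accuracy,
# as --evaluate-only / --test-scaling-normalization are described to do.
for name, prep in [("standard", StandardScaler()),
                   ("minmax", MinMaxScaler()),
                   ("l2-normalize", Normalizer())]:
    model = make_pipeline(prep, RandomForestClassifier(n_estimators=100, random_state=0))
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```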
Data
Encrypted network traffic was collected on an isolated computer visiting different Wikipedia and New York Times articles, different Google search queries (collected in the form of their autocomplete results and their results page), and different actions taken on a Virtual Reality headset.
Data for this experiment was stored and analyzed in the form of a txt file for each experiment which contains:
The first number is a classification label denoting which website, query, or VR action is taking place.
The remaining numbers in each line denote:
The size of a packet,
and the direction it is traveling.
negative numbers denote incoming packets
positive numbers denote outgoing packets
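A minimal parser for one line in this format might look like the following sketch; the whitespace-separated layout and the example line are assumptions based on the description above, not an excerpt from the actual data files.

```python
def parse_line(line: str):
    """Parse one experiment line: class label, then signed packet sizes
    (negative = incoming, positive = outgoing)."""
    nums = [int(tok) for tok in line.split()]
    label, packets = nums[0], nums[1:]
    incoming = [p for p in packets if p < 0]
    outgoing = [p for p in packets if p > 0]
    return label, incoming, outgoing

label, incoming, outgoing = parse_line("3 -1500 520 -60 1200")
print(label, incoming, outgoing)  # 3 [-1500, -60] [520, 1200]
```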
Figure 4 Data
This data uses specific lines from the Virtual Reality.txt file.
The action 'LongText Search' refers to a user searching for "Saint Basils Cathedral" with text in the Wander app.
The action 'ShortText Search' refers to a user searching for "Mexico" with text in the Wander app.
The .xlsx and .csv files are identical.
Each file includes (from right to left):
The original packet data,
each line of data sorted from smallest to largest packet size in order to calculate the mean and standard deviation of each packet capture,
and the final Cumulative Distribution Function (CDF) calculation that generated the Figure 4 graph.
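The per-capture processing described (sorting, mean and standard deviation, then the CDF behind Figure 4) can be sketched as follows. The packet sizes are hypothetical and the empirical-CDF convention F(x_i) = (i+1)/n is an assumption; the original spreadsheets may use a different convention.

```python
import statistics

sizes = [1500, 60, 520, 60, 1200]          # hypothetical packet sizes from one capture
ordered = sorted(sizes)                    # smallest to largest, as in the files
mean = statistics.mean(ordered)
stdev = statistics.stdev(ordered)          # sample standard deviation
# Empirical CDF point for the i-th smallest value: F(x_i) = (i + 1) / n
cdf = [(x, (i + 1) / len(ordered)) for i, x in enumerate(ordered)]
print(mean, round(stdev, 1), cdf)
```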
Altosight | AI Custom Web Scraping Data
✦ Altosight provides global web scraping data services with AI-powered technology that bypasses CAPTCHAs, blocking mechanisms, and handles dynamic content.
We extract data from marketplaces like Amazon, aggregators, e-commerce, and real estate websites, ensuring comprehensive and accurate results.
✦ Our solution offers free unlimited data points across any project, with no additional setup costs.
We deliver data through flexible methods such as API, CSV, JSON, and FTP, all at no extra charge.
― Key Use Cases ―
➤ Price Monitoring & Repricing Solutions
🔹 Automatic repricing, AI-driven repricing, and custom repricing rules 🔹 Receive price suggestions via API or CSV to stay competitive 🔹 Track competitors in real-time or at scheduled intervals
➤ E-commerce Optimization
🔹 Extract product prices, reviews, ratings, images, and trends 🔹 Identify trending products and enhance your e-commerce strategy 🔹 Build dropshipping tools or marketplace optimization platforms with our data
➤ Product Assortment Analysis
🔹 Extract the entire product catalog from competitor websites 🔹 Analyze product assortment to refine your own offerings and identify gaps 🔹 Understand competitor strategies and optimize your product lineup
➤ Marketplaces & Aggregators
🔹 Crawl entire product categories and track best-sellers 🔹 Monitor position changes across categories 🔹 Identify which eRetailers sell specific brands and which SKUs for better market analysis
➤ Business Website Data
🔹 Extract detailed company profiles, including financial statements, key personnel, industry reports, and market trends, enabling in-depth competitor and market analysis
🔹 Collect customer reviews and ratings from business websites to analyze brand sentiment and product performance, helping businesses refine their strategies
➤ Domain Name Data
🔹 Access comprehensive data, including domain registration details, ownership information, expiration dates, and contact information. Ideal for market research, brand monitoring, lead generation, and cybersecurity efforts
➤ Real Estate Data
🔹 Access property listings, prices, and availability 🔹 Analyze trends and opportunities for investment or sales strategies
― Data Collection & Quality ―
► Publicly Sourced Data: Altosight collects web scraping data from publicly available websites, online platforms, and industry-specific aggregators
► AI-Powered Scraping: Our technology handles dynamic content, JavaScript-heavy sites, and pagination, ensuring complete data extraction
► High Data Quality: We clean and structure unstructured data, ensuring it is reliable, accurate, and delivered in formats such as API, CSV, JSON, and more
► Industry Coverage: We serve industries including e-commerce, real estate, travel, finance, and more. Our solution supports use cases like market research, competitive analysis, and business intelligence
► Bulk Data Extraction: We support large-scale data extraction from multiple websites, allowing you to gather millions of data points across industries in a single project
► Scalable Infrastructure: Our platform is built to scale with your needs, allowing seamless extraction for projects of any size, from small pilot projects to ongoing, large-scale data extraction
― Why Choose Altosight? ―
✔ Unlimited Data Points: Altosight offers unlimited free attributes, meaning you can extract as many data points from a page as you need without extra charges
✔ Proprietary Anti-Blocking Technology: Altosight utilizes proprietary techniques to bypass blocking mechanisms, including CAPTCHAs, Cloudflare, and other obstacles. This ensures uninterrupted access to data, no matter how complex the target websites are
✔ Flexible Across Industries: Our crawlers easily adapt across industries, including e-commerce, real estate, finance, and more. We offer customized data solutions tailored to specific needs
✔ GDPR & CCPA Compliance: Your data is handled securely and ethically, ensuring compliance with GDPR, CCPA and other regulations
✔ No Setup or Infrastructure Costs: Start scraping without worrying about additional costs. We provide a hassle-free experience with fast project deployment
✔ Free Data Delivery Methods: Receive your data via API, CSV, JSON, or FTP at no extra charge. We ensure seamless integration with your systems
✔ Fast Support: Our team is always available via phone and email, resolving over 90% of support tickets within the same day
― Custom Projects & Real-Time Data ―
✦ Tailored Solutions: Every business has unique needs, which is why Altosight offers custom data projects. Contact us for a feasibility analysis, and we’ll design a solution that fits your goals
✦ Real-Time Data: Whether you need real-time data delivery or scheduled updates, we provide the flexibility to receive data when you need it. Track price changes, monitor product trends, or gather...
https://www.archivemarketresearch.com/privacy-policy
The custom website design market is projected to reach a value of 81,380 million in 2025, exhibiting a CAGR of 8.4% during the forecast period (2025-2033). The increasing demand for personalized and user-centric websites, coupled with the proliferation of e-commerce, has been driving market growth. Furthermore, the rising adoption of mobile devices and the growing number of internet users have spurred the demand for responsive and mobile-friendly website designs.

The key drivers of the custom website design market include the increasing number of businesses establishing an online presence, the demand for improved user experience and engagement, and the growing adoption of cloud-based design tools. The market is segmented based on application (enterprise and individual) and type (independent team and crowdsourcing model).

North America and Europe are major regional markets for custom website design, with significant contributions from the United States, Canada, Germany, and the United Kingdom. The market is characterized by the presence of both established players and emerging startups, with key vendors including Web.com, Digital.com, DreamHost, Thrive Internet Marketing Agency, and crowdspring.
https://www.ibisworld.com/about/termsofuse/
Web design service companies have experienced significant growth over the past few years, driven by the expanding use of the Internet. As online operations have become more widespread, businesses and consumers have increasingly recognized the importance of maintaining an online presence, leading to robust demand for web design services and boosting the industry’s profit. The rise in broadband connections and online business activities further spotlights this trend, making web design a vital component of modern commerce and communication. This solid foundation suggests the industry has been thriving despite facing some economic turbulence related to global events and shifting financial climates.

Over the past few years, web design companies have navigated a dynamic landscape marked by both opportunities and challenges. Strong economic conditions have typically favored the industry, with rising disposable incomes and low unemployment rates encouraging both consumers and businesses to invest in professional web design. Despite this, the sector also faced hurdles such as high inflation, which made cost increases necessary and pushed some customers towards cheaper substitutes such as website templates and in-house production, causing a slump in revenue in 2022.

Despite these obstacles, the industry has demonstrated resilience against rising interest rates and economic uncertainties by focusing on enhancing user experience and accessibility. Overall, revenue for web design service companies is anticipated to rise at a CAGR of 2.2% during the current period, reaching $43.5 billion in 2024. This includes a 2.2% jump in revenue in that year. Looking ahead, web design companies will continue to do well, as the strong performance of the US economy will likely support ongoing demand for web design services, bolstered by higher consumer spending and increased corporate profit.
On top of this, government investment, especially at the state and local levels, will provide further revenue streams as public agencies seek to upgrade their web presence. Innovation remains key, with a particular emphasis on designing for mobile devices as more activities shift to on-the-go platforms. Companies that can effectively adapt to these trends and invest in new technologies will likely capture a significant market share, fostering an environment where entry remains feasible yet competitive. Overall, revenue for web design service providers is forecast to swell at a CAGR of 1.9% during the outlook period, reaching $47.7 billion in 2029.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Analysis of ‘Website Analytics’ provided by Analyst-2 (analyst-2.ai), based on source dataset retrieved from https://catalog.data.gov/dataset/ecee4df3-8149-4b74-8927-428ea920b758 on 13 February 2022.
--- Dataset description provided by original source is as follows ---
Web traffic statistics for several City-Parish websites (brla.gov, city.brla.gov, Red Stick Ready, GIS, Open Data, etc.). Information provided by Google Analytics.
--- Original source retains full ownership of the source dataset ---
Analysis of the websites and ICT uses of Walloon municipalities
This file contains the data collected for Sanne Elling’s doctoral thesis: ‘Evaluating website quality: Five studies on user-focused evaluation methods’.
Summary:
The benefits of evaluating websites among potential users are widely acknowledged. There are several methods that can be used to evaluate website quality from a user’s perspective. In current practice, many evaluations are executed with inadequate methods that lack research-based validation. This thesis aims to gain more insight into evaluation methodology and to contribute to a higher standard of website evaluation in practice. A first way to evaluate website quality is measuring the users’ opinions. This is often done with questionnaires, which gather opinions in a cheap, fast, and easy way. However, many questionnaires seem to miss a solid statistical basis and a justification of the choice of quality dimensions and questions. We therefore developed the ‘Website Evaluation Questionnaire’ (WEQ), which was specifically designed for the evaluation of governmental websites. In a study in online and laboratory settings, the WEQ proved to be a valid and reliable instrument. A way to gather more specific user opinions is to invite participants to review website pages. Participants provide their comments by clicking on a feedback button, marking a problematic segment, and formulating their feedback.
There has been debate about the extent to which users are able to provide relevant feedback. The results of our studies showed that participants were able to provide useful feedback. They signalled many relevant problems that were indeed experienced by users who needed to find information on the website. Website quality can also be measured during participants’ task performance. A frequently used method is the concurrent think-aloud method (CTA), which involves participants verbalizing their thoughts while performing tasks. There have been doubts about the usefulness and exhaustiveness of participants’ verbalizations. We therefore combined CTA and eye tracking in order to examine the cognitive processes that participants do and do not verbalize. The results showed that the participants’ verbalizations provided substantial information in addition to the directly observable user problems. There was also a rather high percentage of silences (27%) during which interesting observations could be made about the users’ processes and obstacles. A thorough evaluation should therefore combine verbalizations and (eye tracking) observations. In a retrospective think-aloud (RTA) evaluation, participants verbalize their thoughts afterwards while watching a recording of their performance. A problem with RTA is that participants do not always remember the thoughts they had during their task performance. We therefore complemented the dynamic screen replay of their actions (pages visited and mouse movements) with a dynamic gaze replay of the participants’ eye movements.
Contrary to our expectations, no differences were found between the two conditions. It is not possible to draw conclusions on the single best method. The value of a specific method is strongly influenced by the goals and context of an evaluation. Also, the outcomes of the evaluation not only depend on the method, but also on other choices during the evaluation, such as participant selection, tasks, and the subsequent analysis.
Unlock the Potential of Your Web Traffic with Advanced Data Resolution
In the digital age, understanding and leveraging web traffic data is crucial for businesses aiming to thrive online. Our pioneering solution transforms anonymous website visits into valuable B2B and B2C contact data, offering unprecedented insights into your digital audience. By integrating our unique tag into your website, you unlock the capability to convert 25-50% of your anonymous traffic into actionable contact rows, directly deposited into an S3 bucket for your convenience. This process, known as "Web Traffic Data Resolution," is at the forefront of digital marketing and sales strategies, providing a competitive edge in understanding and engaging with your online visitors.
Comprehensive Web Traffic Data Resolution Our product stands out by offering a robust solution for "Web Traffic Data Resolution," a process that demystifies the identities behind your website traffic. By deploying a simple tag on your site, our technology goes to work, analyzing visitor behavior and leveraging proprietary data matching techniques to reveal the individuals and businesses behind the clicks. This innovative approach not only enhances your data collection but does so with respect for privacy and compliance standards, ensuring that your business gains insights ethically and responsibly.
Deep Dive into Web Traffic Data At the core of our solution is the sophisticated analysis of "Web Traffic Data." Our system meticulously collects and processes every interaction on your site, from page views to time spent on each section. This data, once anonymous and perhaps seen as abstract numbers, is transformed into a detailed ledger of potential leads and customer insights. By understanding who visits your site, their interests, and their contact information, your business is equipped to tailor marketing efforts, personalize customer experiences, and streamline sales processes like never before.
Benefits of Our Web Traffic Data Resolution Service Enhanced Lead Generation: By converting anonymous visitors into identifiable contact data, our service significantly expands your pool of potential leads. This direct enhancement of your lead generation efforts can dramatically increase conversion rates and ROI on marketing campaigns.
Targeted Marketing Campaigns: Armed with detailed B2B and B2C contact data, your marketing team can create highly targeted and personalized campaigns. This precision in marketing not only improves engagement rates but also ensures that your messaging resonates with the intended audience.
Improved Customer Insights: Gaining a deeper understanding of your web traffic enables your business to refine customer personas and tailor offerings to meet market demands. These insights are invaluable for product development, customer service improvement, and strategic planning.
Competitive Advantage: In a digital landscape where understanding your audience can make or break your business, our Web Traffic Data Resolution service provides a significant competitive edge. By accessing detailed contact data that others in your industry may overlook, you position your business as a leader in customer engagement and data-driven strategies.
Seamless Integration and Accessibility: Our solution is designed for ease of use, requiring only the placement of a tag on your website to start gathering data. The contact rows generated are easily accessible in an S3 bucket, ensuring that you can integrate this data with your existing CRM systems and marketing tools without hassle.
How It Works: A Closer Look at the Process Our Web Traffic Data Resolution process is streamlined and user-friendly, designed to integrate seamlessly with your existing website infrastructure:
Tag Deployment: Implement our unique tag on your website with simple instructions. This tag is lightweight and does not impact your site's loading speed or user experience.
Data Collection and Analysis: As visitors navigate your site, our system collects web traffic data in real-time, analyzing behavior patterns, engagement metrics, and more.
Resolution and Transformation: Using advanced data matching algorithms, we resolve the collected web traffic data into identifiable B2B and B2C contact information.
Data Delivery: The resolved contact data is then securely transferred to an S3 bucket, where it is organized and ready for your access. This process occurs daily, ensuring you have the most up-to-date information at your fingertips.
Integration and Action: With the resolved data now in your possession, your business can take immediate action. From refining marketing strategies to enhancing customer experiences, the possibilities are endless.
Security and Privacy: Our Commitment Understanding the sensitivity of web traffic data and contact information, our solution is built with security and privacy at its core. We adhere to strict data protection regulat...
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
A list of 504,038 Italian domains found to contain Google Analytics.
The front page for Italy-related domain names has been accessed through HTTPS or HTTP and analysed with webbkoll and jq to gather data about third-party requests, cookies and other privacy-invasive features. Together with the actual URL visited, the user/property ID is provided for 495,663 domains (extracted either from the cookies deposited or the URL of requests to Google Analytics). MX and TXT records for the domains are also provided.
The most common ID found was 23LNSPS7Q6, with over 35k domains calling it (seemingly associated with italiaonline.it). The most common responding IP addresses were 3 AWS IPv4 addresses (over 40k domains) and 2 CloudFlare IPv6 addresses (over 12k domains).
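As an illustration of the kind of ID extraction described (the study's exact method is not shown here), a regular-expression sketch for pulling a classic UA- or GA4 G- identifier out of a request URL; the patterns are assumptions based on common Google Analytics ID formats:

```python
import re

# Match either a Universal Analytics property ID (UA-XXXXXXX-N) or a
# GA4 measurement ID (G-XXXXXXXXXX). Illustrative patterns only.
GA_ID = re.compile(r"\b(UA-\d{4,10}-\d{1,4}|G-[A-Z0-9]{6,12})\b")

def extract_ga_id(text: str):
    """Return the first GA identifier found in a URL or cookie string, else None."""
    m = GA_ID.search(text)
    return m.group(1) if m else None

print(extract_ga_id("https://www.google-analytics.com/collect?v=1&tid=UA-12345678-1"))
# UA-12345678-1
```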
https://www.marketresearchforecast.com/privacy-policy
The global website visitor tracking software market is experiencing robust growth, driven by the increasing need for businesses to understand online customer behavior and optimize their digital strategies. The market, estimated at $5 billion in 2025, is projected to expand at a Compound Annual Growth Rate (CAGR) of 15% from 2025 to 2033, reaching approximately $15 billion by 2033. This expansion is fueled by several key factors, including the rising adoption of digital marketing strategies, the growing importance of data-driven decision-making, and the increasing sophistication of website visitor tracking tools. Cloud-based solutions dominate the market due to their scalability, accessibility, and cost-effectiveness, particularly appealing to Small and Medium-sized Enterprises (SMEs). However, large enterprises continue to invest significantly in on-premise solutions for enhanced data security and control. The market is highly competitive, with numerous established players and emerging startups offering a range of features and functionalities. Technological advancements, such as AI-powered analytics and enhanced integration with other marketing tools, are shaping the future of the market.

The market's geographical distribution reflects the global digital landscape. North America, with its mature digital economy and high adoption rates, holds a significant market share. However, regions like Asia-Pacific are showing rapid growth, driven by increasing internet penetration and digitalization across various industries. Despite the overall positive outlook, challenges such as data privacy regulations and the increasing complexity of website tracking technology are influencing market dynamics. The ongoing competition among vendors necessitates continuous innovation and the development of more user-friendly and insightful tools.
The future growth of the website visitor tracking software market is promising, fueled by the continuing importance of data-driven decision-making within marketing and business strategies. A key factor will be the ongoing adaptation to evolving privacy regulations and user expectations.
https://www.datainsightsmarket.com/privacy-policy
Market Overview: The global PC website builder market is projected to reach a value of USD XX million by 2033, exhibiting a CAGR of XX% during the forecast period. This growth is primarily driven by the increasing demand for user-friendly and cost-effective website creation solutions for businesses and individuals. The proliferation of e-commerce, digital marketing, and the growing number of online users have further contributed to the market expansion.

Market Drivers and Trends: Key drivers of the market include the increasing accessibility of web development tools, advancements in artificial intelligence (AI) and machine learning (ML), and the growing adoption of mobile-responsive website design. Additionally, the shift towards remote work and hybrid work models has further increased the demand for website builders that offer flexibility and ease of use. Notable trends include the integration of AI-powered features, the emergence of cloud-based solutions, and the growing popularity of code-free website development platforms.
https://www.archivemarketresearch.com/privacy-policy
The global website maintenance and support services market is expected to grow from USD 5565 million in 2025 to USD 9820 million by 2033, at a CAGR of 8.2% during the forecast period. The increasing adoption of websites by businesses and the growing need for ensuring their uptime and performance are driving the market growth. The rising complexity of websites and the need for regular updates and security patches are also contributing to the market demand. The major drivers of the Website Maintenance and Support Services market include the increasing number of websites, rising demand for website security, and growing use of mobile devices. Additionally, the evolving regulatory landscape and the need for compliance with industry standards are also driving the market growth. However, factors such as the availability of open-source website development tools and the trend towards do-it-yourself website maintenance may restrain the market growth to some extent. The market is segmented by type (monthly fee model, project fee model, and others) and by application (bank and financial institution websites, campus and government websites, internet and e-commerce websites, listed company websites, and others). The monthly fee model is expected to hold the largest market share during the forecast period, owing to the increasing adoption of subscription-based services.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Abstract The results derived from a content analysis applied to websites from 308 municipalities in Portugal are discussed. Despite the name of the method (which could possibly lead to confusion), the study has exclusively focused on the formal parameters of online communication; that is, audiovisual appearance, information architecture, usability, accessibility, and features of Web 2.0 present in the websites of the sample. As the main strategy for data processing, a Formal Quality Index was created from the computation of certain items in the codebook; a procedure adapted from previous research was satisfactorily replicated in the present study. This indicator was used to make interregional comparisons, which showed statistically significant differences, and it was also correlated with factors such as population, budget, purchasing power, or technological development of municipalities; all these were predictors of the formal status of the local Portuguese e-Administration.
CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This research investigated the use of a company's official website as a communication medium and for online branding purposes in the telecommunication industry in Indonesia. The study used a population of official websites from seven telecommunication companies in Indonesia. The indicators used in the analysis were website content, website ease of use, and website interactivity. The results showed that 100% of telecommunication companies in Indonesia had a website that could be used as part of an online branding strategy. The indicators most used by these companies for online branding were website content (100%) and website ease of use (93.65%), while the use of website interactivity (53%) still needed to be maximized. The findings are useful in revealing how telecommunications companies in developing countries use their websites as a communication medium for online branding.
https://www.datainsightsmarket.com/privacy-policy
The global recipe website market is projected to reach USD 10.5 billion by 2033, exhibiting a CAGR of 6.5% during the forecast period from 2025 to 2033. The growth of the market is primarily driven by the increasing popularity of online cooking and the convenience of accessing recipes from a variety of sources. Additionally, the widespread use of smartphones and tablets has made it easier than ever for people to find and follow recipes while cooking. The market is segmented into various segments based on application, type, and region. Dessert recipes accounted for the highest market share in 2025. The rising demand for indulgent desserts and the availability of a wide variety of dessert recipes online are key factors contributing to this segment's growth. In terms of type, video recipes are projected to witness the fastest growth during the forecast period. The increasing popularity of video content and the ease of following video instructions while cooking are major factors driving the growth of this segment. North America is anticipated to hold the largest market share throughout the forecast period. The presence of well-established recipe websites and the high adoption of online cooking in the region are key factors supporting the market growth in this region.
https://www.ibisworld.com/about/termsofuse/
Website creation software developers have grown more popular as the world has become more digital. Demand for websites has risen since the dawn of the internet, benefiting this industry, and expanding broadband access has amplified the effect. Companies need websites to market their products and services to online audiences, and as more people come online, more businesses turn to this type of software. Revenue has grown at a CAGR of 7.1% through the end of 2024, reaching $14.8 billion, including a 2.1% rise in 2023 alone. More consumers and businesses are moving online, fueling the need for websites to handle that activity. The difficulty of building a website for those who aren't tech-savvy has also helped the industry, since its software is ready to deploy on the spot. Remote work has increased how much business activity happens online, further boosting demand for websites that capture it. High costs have been a bane for the industry, as the need for a talented workforce remains; as a result, profit has declined during this period. Online services are expected to become increasingly integrated into daily life through 2029, and new features will necessitate more frequent website updates. As internet saturation grows, companies must find new ways to generate revenue; hikes in subscription fees will be one way they enhance their market positions. Overall, industry revenue is expected to grow at a CAGR of 2.4% through 2028, reaching $17.2 billion.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Analysis of ‘New York State Locality Hierarchy with Websites’ provided by Analyst-2 (analyst-2.ai), based on source dataset retrieved from https://catalog.data.gov/dataset/113a63bc-1105-4ad6-aedf-9cc1dcdd9b32 on 27 January 2022.
--- Dataset description provided by original source is as follows ---
The dataset contains a hierarchical listing of New York State counties, cities, towns, and villages, as well as official locality websites.
--- Original source retains full ownership of the source dataset ---
https://www.datainsightsmarket.com/privacy-policy
The recipe website market is experiencing significant growth, driven by the increasing popularity of home cooking, the convenience of online recipe access, and the rise of social media platforms that foster recipe sharing. In 2025, the market was valued at million and is projected to reach a staggering million by 2033, exhibiting a CAGR of XX% during the forecast period. Key trends driving this growth include the integration of artificial intelligence and machine learning into recipe websites, providing personalized recommendations and improved search functionality. Additionally, the emergence of meal delivery services and the increasing focus on healthy eating are fueling the demand for reliable and accessible recipe resources. Major players in the market include AllRecipes, FoodNetwork, Genius Kitchen, and Yummly, who continue to enhance their offerings and engage with users through social media and community forums.
Apache License, v2.0https://www.apache.org/licenses/LICENSE-2.0
License information was derived automatically
This dataset is designed to aid in the analysis and detection of phishing websites. It contains various features that help distinguish between legitimate and phishing websites based on their structural, security, and behavioral attributes.
Result – Indicates whether a website is phishing or legitimate.
Prefix_Suffix – Checks if the URL contains a hyphen (-), which is commonly used in phishing domains.
double_slash_redirecting – Detects if the URL redirects using //, which may indicate a phishing attempt.
having_At_Symbol – Identifies the presence of @ in the URL, which can be used to deceive users.
Shortining_Service – Indicates whether the URL uses a shortening service (e.g., bit.ly, tinyurl).
URL_Length – Measures the length of the URL; phishing URLs tend to be longer.
having_IP_Address – Checks if an IP address is used in place of a domain name, which is suspicious.
having_Sub_Domain – Evaluates the number of subdomains; phishing sites often have excessive subdomains.
SSLfinal_State – Indicates whether the website has a valid SSL certificate (secure connection).
Domain_registeration_length – Measures the duration of domain registration; phishing sites often have short lifespans.
age_of_domain – The age of the domain in days; older domains are usually more trustworthy.
DNSRecord – Checks if the domain has valid DNS records; phishing domains may lack these.
Favicon – Determines if the website uses an external favicon (which can be a sign of phishing).
port – Identifies if the site is using suspicious or non-standard ports.
HTTPS_token – Checks if "HTTPS" is included in the URL but is used deceptively.
Request_URL – Measures the percentage of external resources loaded from different domains.
URL_of_Anchor – Analyzes anchor tags (<a> links) and their trustworthiness.
Links_in_tags – Examines <meta>, <script>, and <link> tags for external links.
SFH (Server Form Handler) – Determines if form actions are handled suspiciously.
Submitting_to_email – Checks if forms submit data directly to an email instead of a web server.
Abnormal_URL – Identifies if the website's URL structure is inconsistent with common patterns.
Redirect – Counts the number of redirects; phishing websites may have excessive redirects.
on_mouseover – Checks if the website changes content when hovered over (used in deceptive techniques).
RightClick – Detects if right-click functionality is disabled (phishing sites may disable it).
popUpWindow – Identifies the presence of pop-ups, which can be used to trick users.
Iframe – Checks if the website uses <iframe> tags, often used in phishing attacks.
web_traffic – Measures the website's Alexa ranking; phishing sites tend to have low traffic.
Page_Rank – Google PageRank score; phishing sites usually have a low PageRank.
Google_Index – Checks if the website is indexed by Google (phishing sites may not be indexed).
Links_pointing_to_page – Counts the number of backlinks pointing to the website.
Statistical_report – Uses external sources to verify if the website has been reported for phishing.
Result – The classification label (1: Legitimate, -1: Phishing).
This dataset is valuable for:
✅ Machine Learning Models – Developing classifiers for phishing detection.
✅ Cybersecurity Research – Understanding patterns in phishing attacks.
✅ Browser Security Extensions – Enhancing anti-phishing tools.
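To make the feature definitions above concrete, here is a minimal sketch of how a few of the purely URL-based features (having_IP_Address, URL_Length, having_At_Symbol, Prefix_Suffix, Shortining_Service, double_slash_redirecting) could be computed for a raw URL. The 1/-1 encoding follows the dataset's label convention, but the specific thresholds (e.g., the length cutoff) and the shortener list are assumptions for illustration, not the dataset's exact extraction rules.

```python
import re
from urllib.parse import urlparse

# Assumed shortener list for illustration; real feature extraction
# would use a much larger set of known shortening services.
SHORTENERS = {"bit.ly", "tinyurl.com", "goo.gl", "t.co", "ow.ly"}

def url_features(url: str) -> dict:
    """Compute a handful of URL-based features, encoded as
    1 = legitimate-looking, -1 = suspicious (dataset convention)."""
    # Take the hostname, ignoring any deceptive user@ prefix and port
    host = urlparse(url).netloc.split("@")[-1].split(":")[0]
    return {
        # having_IP_Address: raw IPv4 address instead of a domain name
        "having_IP_Address": -1 if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", host) else 1,
        # URL_Length: long URLs treated as suspicious (threshold assumed)
        "URL_Length": -1 if len(url) >= 54 else 1,
        # having_At_Symbol: '@' anywhere in the URL can hide the real host
        "having_At_Symbol": -1 if "@" in url else 1,
        # Prefix_Suffix: hyphen in the domain itself (not the path)
        "Prefix_Suffix": -1 if "-" in host else 1,
        # Shortining_Service: host is a known URL shortener
        "Shortining_Service": -1 if host in SHORTENERS else 1,
        # double_slash_redirecting: '//' appearing after the scheme part
        "double_slash_redirecting": -1 if url.rfind("//") > 7 else 1,
    }

print(url_features("http://bit.ly/2xYz"))
print(url_features("https://203.0.113.7/secure-login@example.com"))
```

Feature vectors like these (extended with the WHOIS-, DNS-, and content-based features in the list above) are what a phishing classifier would be trained on.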