Department of Civil Engineering, Carleton University, Ottawa, Ontario, Canada
Abstract: One of the problems with current practices across the various domains of facility management is that each facility is managed by its stakeholder in isolation from the management of other similar facilities. However, with the advent of new technologies such as cloud computing, we have an opportunity to unify the management of multiple geographically dispersed facilities. To that end, this paper presents our joint research efforts on cloud-based smart facility management. More precisely, we present a cloud-based platform for managing sensor-based bridge infrastructure and smart machinery. Although the paper focuses on these two applications, the proposed cloud-based platform is designed to support and manage a multitude of smart facilities.
Abílio C. Da Silva Júnior – Aloísio V. Lira Neto – Victor Hugo C. De Albuquerque
Universidade de Fortaleza
Roberto Munoz
Universidad de Valparaíso
María De Los Ángeles Quezada
Instituto Tecnológico de Tijuana
Mohammad Mehedi Hassan
King Saud University
Abstract: The scarcity of the planet’s water resources is a concern of several international entities and governments. Smart solutions for water quality monitoring are gaining prominence with advances in communication technology. This work’s primary goal is to develop a new online system, called the Internet of Water Things (IoWT), to monitor and manage raw water resources. To this end, a platform was developed based on a serverless architecture and the Internet of Things Architectural Reference Model, and its performance was validated in a simulation environment using several electronic devices. The research considers a system for capturing raw water from tubular wells, where each well has a level sensor, a temperature sensor, and a rain gauge. Data is collected every minute by an electronic device and sent to the IoWT system every hour. Data analysis shows that the amount of memory allocated to the serverless functions has minimal impact on efficiency. Applied to a real case, the IoWT system connects a device installed in a water well to the platform, with data transmitted over a 3G network and then processed. The proposed approach therefore has great potential as a complementary tool for monitoring raw water and supporting decision-making in the management of water resources.
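The well-side data flow described in the abstract — readings taken every minute, batched locally, and uploaded once per hour — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all class and field names (`Reading`, `WellDevice`, `level_cm`, etc.) are hypothetical.

```python
"""Sketch of the per-well device logic: sample every minute, flush hourly.
All names are illustrative assumptions, not the IoWT system's actual API."""
from dataclasses import dataclass, field
from typing import List

@dataclass
class Reading:
    minute: int       # minute index within the hour
    level_cm: float   # water level sensor
    temp_c: float     # temperature sensor
    rain_mm: float    # rain gauge

@dataclass
class WellDevice:
    well_id: str
    buffer: List[Reading] = field(default_factory=list)

    def sample(self, reading: Reading) -> None:
        # Called once per minute by the device's timer.
        self.buffer.append(reading)

    def flush(self) -> List[Reading]:
        # Called once per hour: return the batch for uplink (3G in the
        # described deployment) and clear the local buffer.
        batch, self.buffer = self.buffer, []
        return batch

device = WellDevice("well-01")
for minute in range(60):
    device.sample(Reading(minute, 152.4, 21.3, 0.0))
batch = device.flush()
print(len(batch), len(device.buffer))  # 60 0
```

Buffering locally and transmitting hourly keeps radio usage low, which matters for battery-powered devices on a cellular link.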
Supercomputing plays a crucial role in academic research by providing researchers with the computational power needed to perform complex and data-intensive tasks that are beyond the capabilities of standard computers. These advanced computing systems offer significant benefits and opportunities for researchers across various disciplines. Here are some key roles that supercomputing fulfills in academic research:
Simulation and Modeling: Supercomputers are used to simulate and model complex phenomena that cannot be easily replicated in real-world experiments. This is particularly important in fields like physics, chemistry, climate science, and engineering. Researchers can simulate the behavior of materials, climate patterns, particle interactions, and more, enabling a deeper understanding of natural processes and guiding experimental design.
Big Data Analysis: In many academic disciplines, researchers are dealing with vast amounts of data generated from experiments, observations, or simulations. Supercomputers excel in processing and analyzing big data, extracting valuable insights, and identifying patterns or correlations that would be difficult or impossible to detect using traditional computing resources.
Genomics and Bioinformatics: Supercomputing plays a vital role in genomics and bioinformatics research. Analyzing and comparing genomic data from various species or individuals requires immense computational power. Supercomputers help researchers analyze DNA sequences, identify genes associated with diseases, and explore the complexities of biological systems.
Drug Discovery and Computational Biology: Supercomputers are instrumental in drug discovery and computational biology, where researchers use simulations to understand how drugs interact with target proteins or predict the structure of complex biological molecules. These simulations help in the development of new drugs and therapies.
Astrophysics and Cosmology: Supercomputing is used to simulate the behavior of galaxies, stars, and the universe as a whole. Astrophysicists and cosmologists rely on supercomputers to model the evolution of celestial bodies, study cosmic events, and explore the mysteries of the universe.
Machine Learning and AI Research: Supercomputers accelerate research in artificial intelligence (AI) and machine learning by providing the computational power needed to train large-scale models and algorithms. This is critical for applications like natural language processing, image recognition, and autonomous systems.
Optimization and Data-Driven Decision Making: In various fields, supercomputing enables optimization problems to be solved more efficiently, leading to data-driven decision making. This is relevant in areas such as logistics, transportation, finance, and operations research.
Climate and Environmental Studies: Supercomputers are extensively used in climate and environmental research to model climate change, weather patterns, and the impact of human activities on the environment. These simulations help in understanding and mitigating the effects of global warming and other environmental challenges.
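Most of the workloads above share one underlying pattern: split a large dataset or problem into chunks, process the chunks concurrently, and combine the partial results. A minimal sketch of that pattern, using the Python standard library's thread pool as a stand-in for a real HPC scheduler such as MPI or Slurm:

```python
"""Illustrative data-parallel sketch: chunk the input, fan out the work,
reduce the partial results. A thread pool stands in for HPC parallelism."""
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    # Each worker processes one slice of the data independently.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_sum_of_squares, chunks)
    # Reduce step: combine the per-chunk results.
    return sum(partials)

data = list(range(1000))
print(parallel_sum_of_squares(data) == sum(x * x for x in data))  # True
```

On a supercomputer the same map-reduce shape is scaled across thousands of nodes, with interconnect bandwidth and the reduce step typically becoming the limiting factors.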
A small, entry-level supercomputer designed for academic or research purposes might cost around $500,000 to $1 million. These systems typically have modest computing power and are used in smaller research institutions or organizations with limited budgets.
Mid-range supercomputers with more significant computational capabilities can cost anywhere from $1 million to $10 million. These systems are often used in larger research institutions, national laboratories, and universities for advanced scientific simulations, big data analysis, and AI research.
At the high end, the most powerful and cutting-edge supercomputers, known as “exascale” systems, can cost several hundred million to over a billion dollars. These machines are at the forefront of technology and are typically used for groundbreaking research in areas like climate modeling, nuclear research, drug discovery, and national security applications.