5 results for Open Business Model
in Digital Commons - Michigan Tech
Abstract:
Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer’s memory (the threshold depends on the computer available, but there is always a data set too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of a data subset (or data chunk) at each time step, which makes the algorithm truly scalable (n can be chosen to fit the available memory). Furthermore, only 2n² elements of the full N × N kernel matrix (where N ≫ n) need to be calculated at each time step, reducing both the time spent producing kernel elements and the complexity of the FCM algorithm itself. Empirical results show that stKFCM, even with relatively small n, can cluster as accurately as kernel fuzzy c-means run on the entire data set while achieving a significant speedup.
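To make the chunk-wise idea concrete, below is a minimal sketch of kernel fuzzy c-means run on a single n-point chunk, assuming an RBF kernel. The function names, parameters (gamma, m), and random initialization are illustrative assumptions, not the authors' code, and the streaming bookkeeping that carries a weighted summary of past chunks forward is omitted; the point is only that the n × n chunk kernel is the largest object held in memory.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfcm_chunk(X, c=3, m=2.0, gamma=1.0, iters=50, tol=1e-6, seed=0):
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)          # n x n kernel: the only O(n^2) object
    rng = np.random.default_rng(seed)
    U = rng.random((c, n))
    U /= U.sum(axis=0)                   # fuzzy memberships; columns sum to 1
    for _ in range(iters):
        W = U ** m                       # fuzzified memberships
        A = W / W.sum(axis=1, keepdims=True)   # normalized weights per cluster
        # Squared feature-space distance from each point to each implicit center:
        # ||phi(x_j) - v_i||^2 = K_jj - 2 (A K)_ij + (A K A^T)_ii
        diagK = np.diag(K)
        cross = A @ K                                  # c x n
        quad = np.einsum('ik,kl,il->i', A, K, A)       # length-c vector
        D2 = np.maximum(diagK[None, :] - 2 * cross + quad[:, None], 1e-12)
        Unew = 1.0 / (D2 ** (1.0 / (m - 1)))
        Unew /= Unew.sum(axis=0)         # standard FCM membership update
        if np.abs(Unew - U).max() < tol:
            U = Unew
            break
        U = Unew
    return U

# Usage on one synthetic chunk of n = 200 points.
X = np.random.default_rng(1).normal(size=(200, 2))
U = kfcm_chunk(X)
labels = U.argmax(axis=0)
```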
Abstract:
Planning, navigation, and search are fundamental human cognitive abilities central to spatial problem solving in search and rescue, law enforcement, and military operations. Despite a wealth of literature on naturalistic spatial problem solving in animals, research on naturalistic spatial problem solving in humans is comparatively sparse and is generally conducted by separate camps with little crosstalk between them. Addressing this deficiency will allow us to predict spatial decision making in operational environments and to understand the factors that lead to those decisions. The present dissertation comprises two related efforts: (1) a set of empirical studies intended to identify characteristics of planning, execution, and memory in naturalistic spatial problem solving tasks, and (2) a computational modeling effort to develop a model of naturalistic spatial problem solving. The results of the behavioral studies indicate that hierarchical representations of the problem space are linear in shape and that human solutions are produced according to multiple optimization criteria. The Mixed Criteria Model presented in this dissertation accounts for global and local human performance on both a traditional and a naturalistic Traveling Salesman Problem. The results of the empirical and modeling efforts have implications for basic and applied science in domains such as problem solving, operations research, human-computer interaction, and artificial intelligence.
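As a hedged illustration of what "multiple optimization criteria" can mean in tour construction, the sketch below greedily builds a Traveling Salesman tour while scoring each candidate step by a weighted mix of two criteria: step distance and deviation from the current heading. The criteria, weights, and names are hypothetical stand-ins chosen for clarity; they are not the dissertation's Mixed Criteria Model.

```python
import numpy as np

def mixed_criteria_tour(points, w_dist=1.0, w_turn=0.5):
    """Greedy tour where each step minimizes w_dist*distance + w_turn*turn angle."""
    points = np.asarray(points, dtype=float)
    unvisited = set(range(1, len(points)))
    tour = [0]
    heading = None                     # direction of the previous step
    while unvisited:
        cur = points[tour[-1]]
        best, best_cost = None, np.inf
        for j in unvisited:
            step = points[j] - cur
            dist = np.linalg.norm(step)
            if heading is None or dist == 0:
                turn = 0.0             # no heading yet, so no turn penalty
            else:
                cosang = np.dot(step, heading) / (dist * np.linalg.norm(heading))
                turn = np.arccos(np.clip(cosang, -1.0, 1.0))   # radians
            cost = w_dist * dist + w_turn * turn
            if cost < best_cost:
                best, best_cost = j, cost
        heading = points[best] - cur
        tour.append(best)
        unvisited.remove(best)
    return tour

# Usage: w_turn > 0 favors smoother, more "human-looking" tours.
pts = np.random.default_rng(0).random((12, 2))
print(mixed_criteria_tour(pts, w_dist=1.0, w_turn=0.5))
```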
Abstract:
High concentrations of fluoride naturally occurring in the groundwater of the Arusha region of Tanzania cause dental, skeletal, and non-skeletal fluorosis in up to 90% of the region’s population [1]. Symptoms of this incurable but completely preventable disease include brittle, discolored teeth, malformed bones, and stiff, swollen joints. The consumption of high-fluoride water has also been shown to cause headaches and insomnia [2] and to adversely affect the development of children’s intelligence [3, 4]. Although this array of symptoms can significantly impair a society’s development and its citizens’ ability to work and enjoy a reasonable quality of life, little is offered in the Arusha region in the way of solutions for the poor, those hardest hit by the problem. Multiple defluoridation technologies exist, yet none are successfully reaching the Tanzanian public. This report takes a closer look at the efforts of one local organization, the Defluoridation Technology Project (DTP), to address the region’s fluorosis problem through the production and dissemination of bone char defluoridation filters, an appropriate-technology solution that is proven to work. The goal of this research is to improve the sustainability of DTP’s operations and to help them reach a wider range of clients so that they may reduce the occurrence of fluorosis more effectively. This was done first through laboratory testing of current products. The results show a wide range in uptake capacity across batches of bone char, underscoring the need to modify the kiln design in order to produce a more consistent, higher-quality product. The issue of filter dissemination was addressed through the development of a multi-level, customer-funded business model promoting the availability of filters to Tanzanians of all socioeconomic levels. Central to this model is the recommendation to focus on community-managed, institution-sized filters in order to make fluoride-free water available to lower-income clients and to increase Tanzanian involvement at the management level.
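For readers unfamiliar with how batch uptake capacity is compared across char batches, the sketch below applies the standard batch adsorption formula q = (C0 − Ce) · V / m. The sample concentrations, volume, and mass are hypothetical, not DTP's measurements; the 1.5 mg/L figure is the WHO guideline value for fluoride in drinking water.

```python
def uptake_capacity(c0_mg_per_L, ce_mg_per_L, volume_L, mass_g):
    """Fluoride removed per gram of bone char: q = (C0 - Ce) * V / m, in mg/g."""
    return (c0_mg_per_L - ce_mg_per_L) * volume_L / mass_g

WHO_LIMIT = 1.5  # mg/L, WHO drinking-water guideline for fluoride

# Hypothetical batches: 10 mg/L initial fluoride, 1 L of water, 5 g of char.
for batch, ce in [("A", 1.2), ("B", 4.8)]:
    q = uptake_capacity(10.0, ce, 1.0, 5.0)
    ok = "meets" if ce <= WHO_LIMIT else "fails"
    print(f"batch {batch}: q = {q:.2f} mg/g, treated water {ok} the WHO limit")
```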
Abstract:
Sensor networks have been an active research area in the past decade because of the variety of their applications. Many studies have addressed the problems underlying the middleware services of sensor networks, such as self-deployment, self-localization, and synchronization. With these middleware services in place, sensor networks have matured into a detection and surveillance paradigm for many real-world applications. Individual sensors are small, so they can be deployed in areas with limited space and make unobstructed measurements in locations that traditional centralized systems would have trouble reaching. However, sensor networks face a few physical limitations that can prevent sensors from performing at their full potential: individual sensors have a limited power supply, the wireless band can become cluttered when multiple sensors transmit at the same time, and limited communication range means the network may not have a 1-hop communication topology, making routing a problem in many cases. Carefully designed algorithms can alleviate these physical limitations and allow sensor networks to be used to their full potential. Graphical models are an intuitive choice for designing sensor network algorithms. This thesis focuses on a classic application of sensor networks: detecting and tracking targets. It develops feasible inference techniques for sensor networks using statistical graphical-model inference, binary sensor detection, event isolation, and dynamic clustering. The main strategy is to use only binary data for rough global inferences and then dynamically form small-scale clusters around the target for detailed computations. This framework is then extended to network topology manipulation, so that it can be applied to tracking under different topology settings. Finally, the system was tested in both simulated and real-world environments. The simulations were performed on various network topologies, from regularly distributed to randomly distributed networks. The results show that the algorithm performs well in randomly distributed networks and hence requires minimal deployment effort. The experiments were carried out in both corridor and open-space settings. An in-home fall-detection system was simulated under real-world conditions: it was set up with 30 Bumblebee radars and 30 ultrasonic sensors driven by TI EZ430-RF2500 boards scanning a typical 800 sq ft apartment. The Bumblebee radars were calibrated to detect a falling human body, and the two-tier tracking algorithm was run on the ultrasonic sensors to track the location of elderly residents.
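The following is a minimal sketch of the two-tier strategy described above: binary detections yield a coarse global estimate, and only the sensors near that estimate are clustered for a refined, range-weighted fix. The sensor model, detection radius, and function names are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def coarse_estimate(positions, fired):
    """Tier 1: centroid of sensors reporting a binary detection."""
    hits = positions[fired]
    return hits.mean(axis=0) if len(hits) else None

def refine_in_cluster(positions, ranges, center, radius=2.0):
    """Tier 2: dynamic cluster around the coarse fix, weighted by inverse range."""
    near = np.linalg.norm(positions - center, axis=1) <= radius
    if not near.any():
        return center
    w = 1.0 / np.maximum(ranges[near], 1e-6)
    return (positions[near] * w[:, None]).sum(axis=0) / w.sum()

# Toy network: a 5 x 6 grid of 30 sensors, target at (2.3, 1.7).
xs, ys = np.meshgrid(np.arange(5.0), np.arange(6.0))
positions = np.column_stack([xs.ravel(), ys.ravel()])
target = np.array([2.3, 1.7])
dists = np.linalg.norm(positions - target, axis=1)   # stand-in range readings
fired = dists < 1.5                                  # binary detections
guess = coarse_estimate(positions, fired)            # tier-1 rough fix
estimate = refine_in_cluster(positions, dists, guess)  # tier-2 refinement
print(guess, estimate)
```

Only the handful of sensors inside the dynamic cluster do detailed work, which is what keeps per-step communication and computation low in a bandwidth- and power-constrained network.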
Abstract:
This thesis seeks the least-cost strategy for reducing CO2 emissions by replacing coal with other energy sources for electricity generation, in the context of the EPA’s proposed regulation of CO2 emissions from existing coal-fired power plants. An ARIMA model is built to forecast coal consumption for electricity generation, and its CO2 emissions, in Michigan from 2016 to 2020. CO2 emission reduction costs are calculated under three emission reduction scenarios: reduction to 17%, 30%, and 50% below the 2005 emission level. The impacts of the Production Tax Credit (PTC) and of the intermittency of renewable energy are also discussed. The results indicate that in most cases natural gas will be the best alternative to coal for electricity generation to meet the CO2 reduction goals; if the PTC for wind power continues after 2015, a combination of natural gas and wind could be the best strategy under the least-cost criterion.
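Below is a minimal sketch of the forecasting step using statsmodels' ARIMA, on a synthetic coal-consumption series. The ARIMA order (1, 1, 1), the synthetic data, and the emission factor of 2.4 tons of CO2 per ton of coal are illustrative assumptions (actual factors vary by coal type), not the thesis's fitted model or inputs.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical annual coal consumption for power generation (million tons),
# 1990-2015, with a slight downward trend plus noise.
years = pd.date_range("1990", "2015", freq="YS")
rng = np.random.default_rng(1)
coal = pd.Series(30 - 0.2 * np.arange(len(years)) + rng.normal(0, 0.5, len(years)),
                 index=years)

model = ARIMA(coal, order=(1, 1, 1))   # (p, d, q) chosen for illustration only
res = model.fit()
forecast = res.forecast(steps=5)       # 2016-2020, matching the abstract's horizon

# Convert forecast coal tonnage to CO2 with an assumed emission factor.
EMISSION_FACTOR = 2.4                  # tons CO2 per ton of coal (illustrative)
co2 = forecast * EMISSION_FACTOR
print(forecast)
print(co2)
```

In practice the order (p, d, q) would be selected from the historical series (e.g., by information criteria), and the CO2 forecast would feed the scenario cost calculations against the 2005 baseline.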