923 results for accelerometer, randomness check
Abstract:
The shoulder joint is a complex integration of soft and hard tissues. It plays an important role in performing daily activities and can be considered a near-perfect compromise between mobility and stability. However, the shoulder is vulnerable to complications such as dislocation and osteoarthritis. Finite element (FE) models have been developed to understand shoulder injury mechanisms and the implications of disease for the shoulder complex, and to assess the quality of shoulder implants. Furthermore, although few in number, FE shoulder models have also been used to answer important clinical questions, such as how a normal shoulder joint differs from an osteoarthritic one. However, in the absence of experimental validation, it is questionable whether the constitutive models applied in these FE models adequately represent the mechanical behaviour of the shoulder elements (cartilage, ligaments, muscles, etc.), and therefore whether current models can be used with confidence to answer clinically relevant questions. The main objective of this review is to critically evaluate the existing FE shoulder models that have been used to investigate clinical problems. Particular attention is given to whether the constitutive models representing the shoulder elements are adequate for drawing clinically relevant conclusions. Suggestions are given for improving the existing shoulder models by including adequate constitutive models of the shoulder elements, so that clinically relevant questions can be answered with confidence.
Abstract:
Our daily lives are becoming more and more dependent on smartphones due to their increased capabilities. Smartphones are used in various ways, from payment systems to assisting the lives of elderly or disabled people. Security threats to these devices are becoming increasingly dangerous, since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform that allows modification even at the operating-system level. Therefore, third-party developers have the opportunity to develop kernel-based low-level security tools, which is unusual for smartphone platforms. Android quickly gained popularity among smartphone developers and beyond, since it is based on Java on top of an "open" Linux kernel, in contrast to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, for example, which held the greatest market share among all smartphone OSs, closed critical APIs to common developers and introduced application certification, because this OS was the main target of smartphone malware in the past. In fact, more than 290 malware samples designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. In this work, we present our current results in analyzing the security of Android smartphones, with a focus on its Linux side. Our results are not limited to Android; they are also applicable to Linux-based smartphones such as the OpenMoko Neo FreeRunner. Our contribution in this work is three-fold. First, we analyze the Android framework and the Linux kernel to check their security functionalities. We survey well-accepted security mechanisms and tools that can increase device security, describe how to adopt these security tools in the Android kernel, and provide an overhead analysis in terms of resource usage. As open smartphones are released and may increase their market share, as Symbian did, they may attract the attention of malware writers. Therefore, our second contribution focuses on malware detection techniques at the kernel level. We test the applicability of existing signature-based and intrusion detection methods in the Android environment. We focus on monitoring events in the kernel; that is, identifying critical kernel, log file, file system, and network activity events, and devising efficient mechanisms to monitor them in a resource-limited environment. Our third contribution comprises initial results of our malware detection mechanism based on static function call analysis. We identified approximately 105 Executable and Linking Format (ELF) executables installed on the Linux side of Android and performed a statistical analysis of the function calls used by these applications. The results of the analysis can be compared against newly installed applications to detect significant differences. Additionally, certain function calls indicate malicious activity. Therefore, we present a simple decision tree for deciding the suspiciousness of the corresponding application. Our results present a first step towards detecting malicious applications on Android-based devices.
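To make the static function call analysis concrete, here is a minimal Python sketch of the general idea: list the function symbols an ELF executable imports and score them against a baseline corpus of known-good binaries. The watch-list, the 5% rarity threshold, and the scoring weights are illustrative assumptions, not the paper's actual decision tree.

import subprocess
from collections import Counter

SUSPICIOUS = {"execve", "fork", "ptrace"}  # hypothetical watch-list

def imported_functions(path):
    """List undefined (imported) symbols of an ELF binary via `nm -D`."""
    out = subprocess.run(["nm", "-D", "--undefined-only", path],
                         capture_output=True, text=True, check=True)
    return {line.split()[-1] for line in out.stdout.splitlines() if line.strip()}

def suspicion_score(path, baseline, corpus_size):
    """Fraction of imports that are rare in the baseline corpus, plus a
    fixed penalty per watch-listed call -- a toy stand-in for a decision tree."""
    calls = imported_functions(path)
    if not calls:
        return 0.0
    rare = sum(1 for c in calls if baseline[c] / corpus_size < 0.05)
    return rare / len(calls) + 0.5 * len(calls & SUSPICIOUS)

baseline = Counter({"printf": 90, "read": 88, "execve": 3})  # toy corpus stats
print(suspicion_score("/bin/ls", baseline, corpus_size=100))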
Abstract:
Driven by the rapid development of ubiquitous and pervasive computing, personalized services and applications are being deployed to support our lives. Accordingly, the number of interfaces and devices (smartphones, tablet computers, etc.) provided to access and consume these services is growing continuously. To reduce the complexity of managing many accounts with different credentials, Single Sign-On (SSO) solutions have been introduced. However, a single password for many accounts represents a single point of failure. Furthermore, an SSO session, once initiated, poses a high risk when the workstation is left unlocked and unattended. In this paper, we present a concept for Persistent Single Sign-On (PSSO) in ubiquitous home environments that uses behavioral biometrics to check the identity of the user continuously in an unobtrusive manner.
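As a rough illustration of continuous identity checking, the following Python sketch smooths per-observation behavioral similarity scores (keystroke timing, gait from an accelerometer, etc.) into a trust level and locks the session when trust falls too low. The gain, threshold, and sample scores are hypothetical; the paper presents the concept, not this scheme.

def update_trust(trust, similarity, gain=0.3, threshold=0.4):
    """Exponentially smooth the trust level with the newest similarity
    score; report whether the session should be locked."""
    trust = (1 - gain) * trust + gain * similarity
    return trust, trust < threshold

trust = 1.0  # session starts fully trusted after an explicit login
for score in [0.9, 0.8, 0.2, 0.1, 0.1]:  # simulated similarity scores
    trust, must_lock = update_trust(trust, score)
    if must_lock:
        print("trust too low: re-authentication required")
        break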
Abstract:
Background: Overweight and obesity have become a serious public health problem in many parts of the world. Studies suggest that making small changes in daily activity levels, such as "breaking up" sedentary time (i.e., standing), may help mitigate the health risks of sedentary behavior. The aim of the present study was to examine time spent in standing (determined by a count threshold), lying, and sitting postures (determined by the inclinometer function) via the ActiGraph GT3X among sedentary adults of differing weight status based on body mass index (BMI) categories. Methods: Participants included 22 sedentary adults (14 men, 8 women; mean age 26.5 ± 4.1 years). All subjects completed the self-report International Physical Activity Questionnaire to determine time spent sitting over the previous 7 days. Participants were included if they spent seven or more hours sitting per day. Postures were determined with the ActiGraph GT3X inclinometer function. Participants were instructed to wear the accelerometer for 7 consecutive days (24 h a day). BMI was categorized as: 18.5 to <25 kg/m2 normal, 25 to <30 kg/m2 overweight, and ≥30 kg/m2 obese. Results: After adjustment for moderate-to-vigorous intensity physical activity and wear-time, participants in the normal weight (n = 10) and overweight (n = 6) groups spent significantly more time standing (6.7 h and 7.3 h, respectively) and less time sitting (7.1 h and 6.9 h, respectively) than those in the obese (n = 6) category (5.5 h and 8.0 h, respectively) (p < 0.001). There were no significant differences in standing and sitting time between the normal weight and overweight groups (p = 0.051 and p = 0.670, respectively). Differences among groups in lying time were not significant (p = 0.55). Conclusion: This study described postural allocations (standing, lying, and sitting) among normal weight, overweight, and obese sedentary adults. The results provide additional evidence for increasing standing time in obesity prevention strategies.
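A simplified Python sketch of how posture time might be summarized from accelerometer epochs is given below. The epoch length, the counts-per-epoch cut-point for standing, and the inclinometer labels are assumptions for illustration, not the study's exact ActiGraph settings.

from collections import Counter

EPOCH_SECONDS = 60
STAND_THRESHOLD = 100  # hypothetical counts-per-epoch cut-point

def classify_epoch(counts, inclinometer):
    """inclinometer is one of 'off', 'standing', 'sitting', 'lying'."""
    if inclinometer in ("lying", "sitting"):
        return inclinometer
    return "standing" if counts >= STAND_THRESHOLD else "sitting"

def hours_by_posture(epochs):
    """epochs: list of (activity counts, inclinometer label) tuples."""
    tally = Counter(classify_epoch(c, inc) for c, inc in epochs)
    return {posture: n * EPOCH_SECONDS / 3600 for posture, n in tally.items()}

print(hours_by_posture([(150, "standing"), (20, "sitting"), (0, "lying")]))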
Abstract:
A technologically innovative study was undertaken across two suburbs in Brisbane, Australia, to assess socioeconomic differences in women's use of the local environment for work, recreation, and physical activity. Mothers from a high and a low socioeconomic suburb were instructed to continue their usual daily routines and to use mobile phone applications (Facebook Places, Twitter, and Foursquare) to 'check in' at each location and destination they reached during a one-week period. These smartphone applications can track travel logistics via built-in geographical information systems (GIS), which record participants' points of latitude and longitude at each destination they reach. Location data were downloaded to Google Earth and Excel for analysis. Women provided additional qualitative data via text regarding the reasons for and social contexts of their travel. We analysed 2183 'check-ins' for 54 women in this pilot study to gain quantitative, qualitative, and spatial data on human-environment interactions. Data were gathered on distances travelled, mode of transport, reason for travel, and social context of travel, and categorised in terms of physical activity type: walking, running, sports, gym, cycling, or playing in the park. We found that the women in both suburbs had similar daily routines, with the exception of physical activity. We identified 15% of 'check-ins' in the lower socioeconomic group as qualifying for the physical activity category, compared with 23% in the higher socioeconomic group. This was explained by more daily walking for transport (1.7 km vs 0.2 km) and less weekly car travel (28.km vs 48.4 km) in the higher socioeconomic suburb. We gained insights into the socio-cultural influences on these differences via additional qualitative data. We discuss the benefits and limitations of using new technologies and Google Earth, with implications for informing future physical and social aspects of urban design and health promotion in socioeconomically diverse cities.
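Distances between successive check-ins can be recovered from the recorded latitude/longitude points; a small Python sketch using the haversine formula is shown below, with made-up Brisbane-area coordinates.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: mean Earth radius

checkins = [(-27.47, 153.02), (-27.48, 153.03), (-27.50, 153.01)]  # made up
legs = [haversine_km(*a, *b) for a, b in zip(checkins, checkins[1:])]
print(f"total distance travelled: {sum(legs):.2f} km")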
Abstract:
The main objective of this paper is to describe the development of a remote-sensing airborne air sampling system for Unmanned Aerial Systems (UAS), providing the capability to detect particle and gas concentrations in real time over remote locations. The design of the air sampling methodology started by defining the system architecture and then selecting and integrating each subsystem. A multifunctional air sampling instrument, capable of simultaneous measurement of particle and gas concentrations, was modified and integrated with ARCAA's Flamingo UAS platform and communications protocols. As a result of the integration process, a system capable of both real-time geo-location monitoring and indexed-link sampling was obtained. Wind tunnel tests were conducted to evaluate the performance of the air sampling instrument under controlled non-stationary conditions at the typical operational velocities of the UAS platform. Once the fully operational remote air sampling system was obtained, the problem of mission design was analyzed through the simulation of different scenarios. Flight tests of the complete air sampling system were then conducted to check the dynamic characteristics of the UAS with the air sampling system and to prove its capability to perform an air sampling mission following a specific flight path.
Abstract:
The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are the most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatial temperatures on mortality, or used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, or on differences in heat-related mortality over time. This thesis aimed to address the following research questions: 1. How can the case-crossover design and distributed lag non-linear models be combined? 2. Is there any significant difference in effect estimates between time series and spatiotemporal models? 3. How can the effects on mortality of temperature changes between neighbouring days be assessed? 4. Do temperature effects on mortality change over time? To combine the case-crossover design and the distributed lag non-linear model, datasets of deaths, weather conditions (minimum temperature, mean temperature, maximum temperature, and relative humidity), and air pollution were acquired for Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model, which allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days and persisted for 10 days. Hot effects were acute, lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature. It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk than time series models that use a single site's temperature or temperature averaged over a network of sites. Daily mortality data were obtained for 163 locations across Brisbane, Australia, from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city from 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality, and a time series model to assess the effects of a single site's temperature and of temperature averaged from 3 monitoring sites. Squared Pearson scaled residuals were used to check model fit. The results show that, even though spatiotemporal models gave a better model fit than time series models, the two gave similar effect estimates. Time series analyses using temperature recorded at a single monitoring site, or the average temperature of multiple sites, estimated the association between temperature and mortality as well as a spatiotemporal model did. A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000. Temperature change was calculated as the current day's mean temperature minus the previous day's mean. In Brisbane, a drop of more than 3 °C between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM at ages 65–74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with mortality: RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for mean temperature. I also examined the variation in the effects of high temperatures on elderly mortality (aged ≥ 75 years) by year, city, and region for 83 large US cities between 1987 and 2000. High-temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into a "main effect" due to high temperatures, modelled with a distributed lag non-linear function, and an "added effect" due to consecutive high-temperature days. I pooled yearly effects across regions, and overall effects at both regional and national levels. The effects of high temperature (both main and added) on elderly mortality varied greatly by year, city, and region. Years with higher heat-related mortality were often followed by years with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems. In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin, allowing the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and high temperatures increased the risk of mortality in Tianjin. Time series models using a single site's temperature, or temperature averaged from several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a large drop or a large increase, increases the risk of mortality. The effect of high temperature on mortality is highly variable from year to year.
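The temperature-change exposure described above is simple to compute; a short Python sketch with pandas is given below. The column names and the sample temperature series are illustrative.

import pandas as pd

df = pd.DataFrame({
    "date": pd.date_range("2000-01-01", periods=6),
    "mean_temp": [24.0, 27.5, 23.9, 24.1, 28.0, 27.8],  # illustrative values
})
df["temp_change"] = df["mean_temp"].diff()  # today's mean minus yesterday's
df["big_drop"] = df["temp_change"] < -3.0   # drop of more than 3 °C
df["big_rise"] = df["temp_change"] > 3.0    # rise of more than 3 °C
print(df[["date", "temp_change", "big_drop", "big_rise"]])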
Abstract:
Accuracy of dose delivery in external beam radiotherapy is usually verified with electronic portal imaging (EPI), in which the treatment beam is used to check the positioning of the patient. However, the resulting megavoltage x-ray images suffer from poor quality. Image quality can be improved by developing a special operating mode in the linear accelerator, in which the existing treatment beam is modified so that it produces enough low-energy photons for imaging. In this work, the problem of optimizing the beam/detector combination to achieve optimal electronic portal image quality is addressed. The linac used for this study was modified to produce two experimental photon beams. These beams, named Al6 and Al10, were non-flat and were produced by 4 MeV electrons hitting aluminum targets 6 and 10 mm thick, respectively. The images produced by a conventional EPI system (6 MV treatment beam and a camera-based EPID with a Cu plate and Gd2O2S screen) were compared with the images produced by the experimental beams and various screens with the same camera. The contrast of 0.8 cm of bone-equivalent material in 5 cm of water increased from 1.5% for the conventional system to 11% for the combination of the Al6 beam with a 200 mg/cm2 Gd2O2S screen. The signal-to-noise ratio calculated for 1 cGy flood-field images increased by about a factor of two for the same EPI systems. The spatial resolution of the two imaging systems was comparable. This work demonstrates that significant improvements in portal image contrast can be obtained by simultaneous optimization of the linac spectrum and the EPI detector.
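For readers unfamiliar with the two image-quality metrics quoted here, the following Python sketch computes object contrast and flood-field signal-to-noise ratio on synthetic pixel data; it is a toy illustration, not the study's measurement procedure.

import numpy as np

rng = np.random.default_rng(0)
background = 1000 + 20 * rng.standard_normal((64, 64))  # open-field region
bone = 890 + 20 * rng.standard_normal((16, 16))         # region behind bone insert

contrast = (background.mean() - bone.mean()) / background.mean()
snr = background.mean() / background.std()              # flood-field SNR
print(f"contrast: {contrast:.1%}, SNR: {snr:.1f}")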
Abstract:
We define a pair-correlation function that can be used to characterize spatiotemporal patterning in experimental images and snapshots from discrete simulations. Unlike previous pair-correlation functions, the pair-correlation functions developed here depend on the location and size of objects. The pair-correlation function can be used to indicate complete spatial randomness, aggregation, or segregation over a range of length scales, and quantifies spatial structures such as the shape, size, and distribution of clusters. Comparing pair-correlation data for various experimental and simulation images illustrates its potential use as a summary statistic for calibrating discrete models of various physical processes.
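A minimal Python sketch of a conventional distance-binned pair-correlation function for point objects in a square domain is given below, normalized so that complete spatial randomness yields values near one. This simplified version ignores edge effects and omits the object-size dependence that distinguishes the paper's function.

import numpy as np

def pair_correlation(points, L, bins=20):
    """Distance-binned pair-correlation for points in an L x L square."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]            # distances of unique pairs
    edges = np.linspace(0.0, L / 2, bins + 1)
    counts, _ = np.histogram(d, edges)
    # expected counts under complete spatial randomness: total pairs
    # times annulus area over domain area (edge effects ignored)
    annulus = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    expected = (n * (n - 1) / 2) * annulus / L ** 2
    return 0.5 * (edges[:-1] + edges[1:]), counts / expected

pts = np.random.default_rng(1).uniform(0, 10, size=(200, 2))
radii, g = pair_correlation(pts, L=10.0)      # g should hover around 1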
Abstract:
Public libraries and coworking spaces seek means to facilitate peer collaboration, peer inspiration, and cross-pollination of skills and creativity. However, social learning, inspiration, and collaboration between coworkers do not come naturally. In (semi-)public spaces in particular, the behavioural norm among unacquainted coworkers is to work in individual silos without taking advantage of social learning or collaboration opportunities. This paper presents results from a pilot study of 'Gelatine', a system that facilitates shared encounters between coworkers by allowing them to digitally 'check in' at a work space. Gelatine displays the skills, areas of interest, and needs of currently present coworkers on a public screen. The results indicate that the system amplifies users' sense of place and awareness of other coworkers, and serves as an interface for social learning through exploratory, opportunistic, and serendipitous inspiration, as well as by helping users identify like-minded peers for follow-up face-to-face encounters. We discuss how Gelatine is perceived by users with different pre-entry motivations, and discuss users' challenges as well as non-use of the system.
Abstract:
Oceania has a relatively low level of crime prevalence, yet in the smaller and under-developed Pacific Island countries (PICs) we have shown that transnational crime has become increasingly common. This risk is contained, but potentially dangerous if state failure or fragility undermines law enforcement capacities. We predict that as the pace of globalization quickens and the demand for raw materials and resources grows, some parts of the Pacific will be prone to criminal enterprises run by both indigenous and foreign crime groups. Australia and New Zealand will remain attractors of illicit goods, notably ATS, but will in turn be source countries for diminishing fish stocks such as bêche-de-mer and abalone, as well as forest timber. Finally, the role of states such as Australia and New Zealand in helping to maintain law enforcement capacities throughout the region will be crucial if organized crime in Oceania is to be kept in check while demand for illicit resources grows.
Abstract:
This paper presents research findings and design strategies that illustrate how digital technology can be applied as a tool for hybrid placemaking in ways that would not be possible in purely digital or physical space. Digital technology has revolutionised the way people learn and gather new information. This trend has challenged the role of the library as a physical place, as well as the interplay of the digital and physical aspects of the library. The paper provides an overview of how the penetration of digital technology into everyday life has affected the library as a place, both as designed by placemakers and as perceived by library users. It then identifies a gap in current library research about the use of digital technology as a tool for placemaking, and reports results from a study of Gelatine, a custom-built user check-in system that displays real-time user information on a set of public screens. Gelatine and its evaluation at The Edge, at the State Library of Queensland, illustrate how combining the affordances of social, spatial, and digital space can improve the connected learning experience of on-site visitors. Future design strategies involving gamifying the user experience in libraries are described based on Gelatine's infrastructure. The presented design ideas and concepts are relevant for managers and designers of libraries as well as other informal, social learning environments.
Abstract:
Operational modal analysis (OMA) is prevalent in the modal identification of civil structures. It requires response measurements of the underlying structure under ambient loads, and a valid OMA method requires that the excitation be white noise in time and space. Although there are numerous applications of OMA in the literature, few have investigated the statistical distribution of a measurement and the influence of such randomness on modal identification. This research applies a modified kurtosis index to evaluate the statistical distribution of raw measurement data, and proposes a windowing strategy employing this index to select quality datasets. To demonstrate how the data selection strategy works, ambient vibration measurements of a laboratory bridge model and of a real cable-stayed bridge were considered, with frequency domain decomposition (FDD) as the target OMA approach for modal identification. The modal identification results obtained from data segments with different randomness were compared. The discrepancy in the FDD spectra of the results indicates that, in order to fulfil the assumptions of an OMA method, special care must be taken in processing a long vibration measurement record. The proposed data selection strategy is easy to apply and verified to be effective in modal analysis.
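The windowed-kurtosis idea can be sketched in a few lines of Python: slide a window over the vibration record, compute the excess kurtosis of each window, and keep the windows closest to Gaussian (excess kurtosis near zero). The window length and tolerance below are illustrative, not the paper's values, and scipy's standard kurtosis stands in for the paper's modified index.

import numpy as np
from scipy.stats import kurtosis

def select_windows(signal, win=4096, tol=0.3):
    """Return start indices of windows whose excess kurtosis (zero for a
    Gaussian signal) lies within the tolerance."""
    starts = range(0, len(signal) - win + 1, win)
    return [s for s in starts if abs(kurtosis(signal[s:s + win])) < tol]

rng = np.random.default_rng(2)
record = rng.standard_normal(60_000)  # stand-in for an accelerometer record
good = select_windows(record)
print(f"{len(good)} quality windows selected out of {60_000 // 4096}")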
Abstract:
Passengers navigating through airports can experience confusion or become lost, resulting in dissatisfaction, missed flights, and flight delays. Passengers moving through airports are required to make many navigation decisions, for example to find the correct check-in desk or the correct boarding gate. Prior experience of using airports is likely to enable intuitive navigation; however, limited research on this topic currently exists. In this paper we investigate passenger navigation by observing 30 participants at one international airport as they moved from check-in to a departure gate. The results indicate that passengers do spend time navigating intuitively through the airport, and that there is a positive correlation between intuitive navigation and airport familiarity. It was also found that participants with lower airport familiarity spent a greater percentage of overall navigation time searching for and assessing/acquiring information than highly familiar participants. These findings provide evidence that passengers with higher airport familiarity have a greater understanding of the process, know better what information to look for, and use this familiarity to navigate intuitively. Findings from this research have design implications for current and future airport terminals, as well as for other large spaces that people navigate through.