Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers' behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users' actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The time period rarely affects navigational queries, while rates for transactional queries vary across periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
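One-step-ahead forecasting of the kind used in the second phase can be illustrated with a minimal autoregressive sketch. The click-count series and the simple AR(1) form below are illustrative assumptions, not the paper's actual models or data:

```python
import numpy as np

def ar1_one_step(series):
    """Fit y[t] = a*y[t-1] + b by least squares, then forecast the next value."""
    y = np.asarray(series, dtype=float)
    a, b = np.polyfit(y[:-1], y[1:], 1)  # slope and intercept of the lag-1 fit
    return a * y[-1] + b                 # one-step-ahead forecast

# Hypothetical hourly counts of sponsored-link clicks
clicks = [120, 118, 125, 130, 128, 135, 140, 138]
forecast = ar1_one_step(clicks)
```

A transfer function model, as used in the paper, would additionally regress the series on lagged values of an input series; the AR(1) above is the simplest special case.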
Abstract:
Chromatographic fingerprints of 46 Eucommia Bark samples were obtained by liquid chromatography with diode array detection (LC-DAD). These samples were collected from eight provinces in China with different geographical locations and climates. Seven common LC peaks that could be used for fingerprinting this popular traditional Chinese medicine were found, and six were identified by LC-MS as substituted resinols (4 compounds), geniposidic acid and chlorogenic acid. Principal components analysis (PCA) indicated that samples from Sichuan, Hubei, Shanxi and Anhui (the SHSA provinces) clustered together. The objects from the other four provinces, Guizhou, Jiangxi, Gansu and Henan, were discriminated and widely scattered on the biplot in four province clusters. The SHSA provinces are geographically close together while the others are spread out. These results suggest that the composition of the Eucommia Bark samples depends on their geographic location and environment. In general, the basis for discrimination on the PCA biplot from the original 46 objects × 7 variables data matrix was the same as that for the SHSA subset (36 × 7 matrix). The seven marker compound loading vectors grouped into three sets: (1) three closely correlating substituted resinol compounds and chlorogenic acid; (2) the fourth resinol compound, identified by the OCH3 substituent in the R4 position, and an unknown compound; and (3) geniposidic acid, which was independent of the set 1 variables and negatively correlated with the set 2 ones. These observations from the PCA biplot were supported by hierarchical cluster analysis (HCA), and indicate that Eucommia Bark preparations may be successfully compared using the HPLC responses from the seven marker compounds and chemometric methods such as PCA and the complementary HCA.
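The PCA step underlying the biplot can be sketched with basic linear algebra. This is a generic sketch for a samples × marker-compound matrix (the random values stand in for peak areas; none of this is the study's actual data or software):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Centre the columns and project samples onto the leading principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # sample scores for a biplot

# Stand-in for a 46 samples x 7 marker-compound peak-area matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(46, 7))
scores = pca_scores(X)
```

Plotting the two score columns (with the rows of `Vt` as loading vectors) gives the kind of biplot on which the SHSA cluster was observed; hierarchical clustering of the same matrix provides the complementary HCA view.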
Abstract:
Vehicle detectors are installed approximately every 300 meters on each lane of the Tokyo Metropolitan Expressway. Various traffic data, such as traffic volume, average speed and time occupancy, are collected by these detectors. Traffic characteristics at every point can be understood by comparing traffic data collected at consecutive points. In this study, we focused on average speed, analyzed road potential using operating speed during free-flow conditions, and identified latent bottlenecks. Furthermore, we analyzed the effects of rainfall level and day of the week on road potential. This method of analysis is expected to be useful for the deployment of ITS applications such as driving assistance, the estimation of parameters for traffic simulation, and feedback to road design as a congestion countermeasure.
Abstract:
"By understanding how places have evolved, we are better able to guide development and change in the urban fabric and avoid the incongruity created by so much of the modern environment" (MacCormac, R (1996), An anatomy of London, Built Environment, Dec 1996 This paper proposes a theory on the relevance of mapping the evolutionary aspects of historical urban form in order to develop a measure of evaluating architectural elements within urban forms, through to deriving parameters for new buildings. By adopting Conzen's identification of the tripartite division of urban form; the consonance inurban form of a particular palce resides in the elements and measurable values tha makeup the fine grain aggregates of urban form. The paper will demonstrate throughthe case study of Brisbane in Australia, a method of conveying these essential components that constitute a cities continuity of form and active usage. By presenting the past as a repository of urban form characteristics, it is argued that concise architectural responses that stem from such knowledge should result in an engaged urban landscape. The essential proposition is that urban morphology is a missing constituent in the process of urban design, and that the approach of the geographical discipline to the study of urban morphology holds the key to providing the evidence of urban growth characteristics, and this methodology suggests possibilities for an architectural approach that can comprehensively determine qualitative aspects of urban buildings. The relevance of this research lies in a potential to breach the limitations of current urban analysis whilst continuing the evolving currency of urban morphology as an integral practice in the design of our cities.
Abstract:
Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for the automatic processing of LIDAR point cloud data, in particular for feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and classify ground and non-ground points by statistically analyzing the skewness and kurtosis of the intensity data. Moreover, the Hough transform is employed to detect power lines from the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained by using LIDAR intensity data than elevation data.
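Intensity-based ground filtering of this kind can be sketched as an iterative skewness-balancing loop: compute the skewness of the remaining intensities and trim the tail until the distribution looks symmetric. The trimming rule, tolerance and synthetic data below are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

def skew_kurtosis(x):
    """Sample skewness and excess kurtosis of a set of intensity values."""
    z = (x - x.mean()) / x.std()
    return (z**3).mean(), (z**4).mean() - 3.0

def filter_ground(intensity, skew_tol=0.1, max_iter=1000):
    """Iteratively trim the tail pulling the skew until |skewness| <= skew_tol."""
    pts = np.sort(np.asarray(intensity, dtype=float))
    for _ in range(max_iter):
        sk, _ = skew_kurtosis(pts)
        if abs(sk) <= skew_tol or len(pts) < 10:
            break
        pts = pts[:-1] if sk > 0 else pts[1:]  # drop the extreme point on the heavy tail
    return pts

# Hypothetical intensities: a Gaussian "ground" mode plus brighter non-ground returns
rng = np.random.default_rng(1)
intensity = np.concatenate([rng.normal(30, 5, 500), rng.normal(80, 5, 60)])
ground = filter_ground(intensity)
```

The points removed by the filter become the object points to which a line detector such as the Hough transform would then be applied.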
Abstract:
Aims: To assess the effectiveness of current treatment approaches to assist benzodiazepine discontinuation. Methods: A systematic review of approaches to benzodiazepine discontinuation in general practice and out-patient settings was undertaken. Routine care was compared with three treatment approaches: brief interventions, gradual dose reduction (GDR) and psychological interventions. GDR was compared with GDR plus psychological interventions or substitutive pharmacotherapies. Results: Inclusion criteria were met by 24 studies, and a further eight were identified by a subsequent search. GDR [odds ratio (OR) = 5.96, confidence interval (CI) = 2.08–17.11] and brief interventions (OR = 4.37, CI = 2.28–8.40) provided superior cessation rates at post-treatment to routine care. Psychological treatment plus GDR was superior to both routine care (OR = 3.38, CI = 1.86–6.12) and GDR alone (OR = 1.82, CI = 1.25–2.67). However, substitutive pharmacotherapies did not add to the impact of GDR (OR = 1.30, CI = 0.97–1.73), and abrupt substitution of benzodiazepines by other pharmacotherapy was less effective than GDR alone (OR = 0.30, CI = 0.14–0.64). Few studies of any technique showed significantly greater benzodiazepine discontinuation than controls at follow-up. Conclusions: Providing an intervention is more effective than routine care. Psychological interventions may improve discontinuation above GDR alone. While some substitutive pharmacotherapies may have promise, current evidence is insufficient to support their use.
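The odds ratios and confidence intervals quoted above are standard 2×2-table statistics. A minimal sketch (the counts are made up for illustration, not the review's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = discontinued / did not (intervention); c/d = the same for control."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30/50 discontinued with GDR vs 15/50 with routine care
result = odds_ratio_ci(30, 20, 15, 35)
```

An OR whose CI excludes 1 (e.g. the GDR result above, CI = 2.08–17.11) indicates a statistically significant advantage over the comparator.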
Abstract:
A diagnosis of cancer represents a significant crisis for a child and their family. As the treatment for childhood cancer has improved dramatically over the past three decades, most children diagnosed with cancer today survive the illness. However, it is still an illness which severely disrupts the lifestyle and typical functioning of the family unit. Most treatments for cancer involve lengthy hospital stays, the endurance of painful procedures and harsh side effects. Research has confirmed that to manage and adapt to such a crisis, families must undertake measures which assist their adjustment. Variables such as the level of family support, the quality of the parents' marital relationship, the coping of other family members, the absence of other concurrent stresses and open communication within the family have been identified as influences on how well families adjust to a diagnosis of childhood cancer. Theoretical frameworks such as the Resiliency Model of Family Adjustment and Adaptation (McCubbin and McCubbin, 1993, 1996) and the Stress and Coping Model of Lazarus and Folkman (1984) have been used to explain how families and individuals adapt to crises or adverse circumstances. Developmental theories have also been proposed to account for how children come to understand and learn about the concept of illness. However, more descriptive information about how families, and children in particular, experience and manage a diagnosis of cancer is still needed. There are still many unanswered questions surrounding how a child adapts to, understands and makes meaning from having a life-threatening illness. As a result, developing an understanding of the impact that such a serious illness has on the child and their family is crucial. A new approach to examining childhood illness such as cancer is currently underway which allows a greater understanding of the experience of childhood cancer to be achieved.
This new approach invites a phenomenological method to investigate the perspectives of those affected by childhood cancer. In the current study, nine families in which there was a diagnosis of childhood cancer were interviewed twice over a 12-month period. Using the qualitative methodology of Interpretative Phenomenological Analysis (IPA), a semi-structured interview was used to explicate the experience of childhood cancer from both the parents' and the child's perspectives. A number of quantitative measures were also administered to gather specific information on the demographics of the sample population. The results of this study revealed a number of pertinent areas which need to be considered when treating such families. More importantly, experiences were explicated which revealed vital phenomena that need to be incorporated to extend current theoretical frameworks. Parents identified the time of the diagnosis as the hardest part of their entire experience. Parents experienced an internal struggle when they were forced to come to the realization that they were not able to help their child get well. Families demonstrated an enormous ability to develop a new lifestyle which accommodated the needs of the sick child, as the sick child became the focus of their lives. Regarding the children, many of them accepted their diagnosis without complaint or question, and they were able to recognise and appreciate the support they received. Physical pain was definitely a component of the children's experience; however, the emotional strain of the loss of peer contact seemed just as severe. Changes over time were also noted, as both parental and child experiences were often pertinent to the stage of treatment the child had reached. The approach used in this study allowed rich and intimate detail about a sensitive issue to be revealed. It also allowed the experience of childhood cancer for parents and children to be more fully realised.
Only now can a comprehensive and sensitive medical and psychosocial approach to the child and family be developed. For example, families may benefit from extra support at the time of diagnosis, as this was identified as one of the most difficult periods. Parents may also require counselling support in coming to terms with their inability to help their child heal. Given the ease with which children accepted their diagnosis, we need to question whether children are more receptive to adversity. Yet the emotional struggle children battled as a result of their illness also needs to be addressed.
Abstract:
Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is a FEA technique whose analysis is derived from a single 2D radiographic image. This ex-vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD.
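The stiffness measure FEA yields (load per unit deflection, N mm⁻¹) can be illustrated with a minimal 1D bar model; the geometry, material and load values below are illustrative assumptions, not the study's femur model:

```python
import numpy as np

def bar_stiffness_matrix(E, A, lengths):
    """Assemble the global stiffness matrix for 1D bar elements in series."""
    n = len(lengths) + 1
    K = np.zeros((n, n))
    for i, L in enumerate(lengths):
        k = E * A / L  # element stiffness, N mm^-1
        K[i:i + 2, i:i + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return K

E, A = 200e3, 100.0                       # N mm^-2 and mm^2 (illustrative values)
K = bar_stiffness_matrix(E, A, [50.0, 50.0])
K_red = K[1:, 1:]                         # fix node 0 as the boundary condition
F = np.array([0.0, 1000.0])               # 1 kN applied at the free end
u = np.linalg.solve(K_red, F)             # nodal deflections, mm
stiffness = F[-1] / u[-1]                 # structural stiffness, N mm^-1
```

A femur model works the same way in principle, only with many thousands of 3D elements whose geometry is derived, in 3D-FEXI's case, from a single 2D radiograph.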
Abstract:
Homophobic hatred: these words summarise online commentary made by people in support of a school that banned gay students from taking their same-sex partners to a school formal. With the growing popularity of online news sites, it seems appropriate to critically examine how these sites are becoming a new arena in which people can express personal opinions about controversial topics. While commentators expressed two dominant viewpoints about the school ban in equal measure (homophobic hatred and human rights), this paper focuses on homophobic hatred as a discursive position and on how the comments work to confirm the legitimacy of the school's decision. Drawing on the work of Foucault and others, the paper examines how the comments constitute certain types of subjectivity by invoking dominant ideas about what it means to be homophobic. The analysis demonstrates the complex and competing skein of strategies that constitute queering school social spaces as a social problem.
Abstract:
Public key cryptography, and with it the ability to compute digital signatures, has made it possible for electronic commerce to flourish. It is thus unsurprising that the proposed Australian NECS will also utilise digital signatures in its system, so as to provide a fully automated process from the creation of electronic land title instruments to the digital signing and electronic lodgment of these instruments. This necessitates an analysis of the fraud risks raised by the use of digital signatures, because a compromise of the integrity of digital signatures would lead to a compromise of the Torrens system itself. This article shows that digital signatures may in fact offer greater security against fraud than handwritten signatures; but to achieve this, digital signatures require an infrastructure in which each component is properly implemented and managed.
Abstract:
Surveillance networks are typically monitored by a few people viewing several monitors displaying the camera feeds. It is therefore very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies: to recognize an event, people and objects must be tracked, and tracking also enhances the performance of tasks such as crowd analysis and human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or because they are unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, making it unnecessary for detection to be carried out separately except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter.
A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Evaluation du Traitement et de l'Interpretation de Sequences vidEO - Evaluation for video understanding) database, and a significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, which in turn benefits from the improved performance in uncertain conditions arising from occlusion and noise provided by a particle filter. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either modality individually. Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore to improving security in areas under surveillance.
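The predict / weight / resample cycle at the heart of a particle filter such as the SCF can be sketched in one dimension. The random-walk motion model and noise parameters below are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def particle_filter(observations, n_particles=500, motion_std=1.0, obs_std=2.0, seed=0):
    """Bootstrap particle filter tracking a 1D position from noisy observations."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(observations[0], obs_std, n_particles)
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, motion_std, n_particles)    # predict (random walk)
        w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)      # weight by likelihood
        w /= w.sum()
        particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample
        estimates.append(particles.mean())
    return np.array(estimates)

# Hypothetical track: an object drifting across the frame, observed with noise
true_path = np.linspace(0.0, 20.0, 30)
rng = np.random.default_rng(1)
observations = true_path + rng.normal(0.0, 2.0, true_path.size)
estimates = particle_filter(observations)
```

A multi-target filter such as the SCF additionally relies on the host tracking system to create and delete modes and maintain object identity, which this single-mode sketch omits.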
Abstract:
Controlled rate thermal analysis (CRTA) technology offers better resolution and a more detailed interpretation of the decomposition processes of a clay mineral such as sepiolite by approaching equilibrium conditions of decomposition through the elimination of the slow transfer of heat to the sample as a controlling parameter on the decomposition process. Constant-rate decomposition processes of a non-isothermal nature reveal changes in the sepiolite as it is converted to an anhydride. In the dynamic experiment, two dehydration steps are observed over the ~20–170 and 170–350 °C temperature ranges, and three dehydroxylation steps are observed over the temperature ranges 201–337, 337–638 and 638–982 °C. The CRTA technology enables the separation of the thermal decomposition steps.
Abstract:
This study analyses the Inclusive Education Statement – 2005, Education Queensland (Appendix 1). The Statement was a product of the Queensland State Government's response to Federal legislation. The Federal Disability Discrimination Act (DDA) 1992 and the subsequent Standards for Education 2005 sought to eliminate discrimination against people with disabilities. Under Section 22 of the Act, it became unlawful for an educational authority to discriminate against a person on the grounds of the person's disability.
Abstract:
The inherent uncertainty and complexity of construction work make construction planning a particularly difficult task for project managers, who need to anticipate and visualize likely future events. Conventional computer-assisted technology can help but is often limited to the constructability issues involved. Virtual prototyping, however, offers an improved method through the visualization of construction activities by computer simulation, enabling a range of 'what-if' questions to be asked and their implications for the total project to be investigated. This paper describes the use of virtual prototyping to optimize construction planning schedules by analyzing resource allocation and planning with integrated construction models, resource models, construction planning schedules and site-layout plans. A real-life case study is presented that demonstrates the use of virtual-prototyping-enabled resource analysis to reallocate space, manage logistics on the access road and arrange tower cranes to achieve a six-day floor construction cycle.