455 results for "news frame analysis"


Relevance: 20.00%

Abstract:

Chromatographic fingerprints of 46 Eucommia Bark samples were obtained by liquid chromatography with diode array detection (LC-DAD). The samples were collected from eight provinces in China with different geographical locations and climates. Seven common LC peaks that could be used for fingerprinting this popular traditional Chinese medicine were found, and six were identified by LC-MS as substituted resinols (4 compounds), geniposidic acid and chlorogenic acid. Principal components analysis (PCA) indicated that samples from Sichuan, Hubei, Shanxi and Anhui (the SHSA provinces) clustered together. The objects from the other four provinces, Guizhou, Jiangxi, Gansu and Henan, were discriminated and widely scattered on the biplot in four province clusters. The SHSA provinces are geographically close together, while the others are spread out. These results suggest that the composition of the Eucommia Bark samples depended on their geographic location and environment. In general, the basis for discrimination on the PCA biplot from the original 46 objects × 7 variables data matrix was the same as that for the SHSA subset (36 × 7 matrix). The seven marker-compound loading vectors grouped into three sets: (1) three closely correlating substituted resinol compounds and chlorogenic acid; (2) the fourth resinol compound, identified by the OCH3 substituent in the R4 position, and an unknown compound; and (3) geniposidic acid, which was independent of the set 1 variables and negatively correlated with the set 2 ones. These observations from the PCA biplot were supported by hierarchical cluster analysis (HCA), and indicate that Eucommia Bark preparations may be compared successfully using the HPLC responses of the seven marker compounds together with chemometric methods such as PCA and the complementary HCA.
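
As a rough illustration of the chemometric workflow described in this abstract, the sketch below runs PCA and complementary hierarchical clustering on a placeholder 46 × 7 peak-area matrix. The data are random numbers standing in for the Eucommia Bark measurements, and scikit-learn and SciPy are assumed; it is a minimal sketch, not the authors' analysis.

```python
# Minimal chemometric sketch: PCA on a 46 samples x 7 marker-peak matrix,
# followed by hierarchical cluster analysis. Placeholder data only.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(46, 7))   # peak areas (placeholder)

# Autoscale so each marker compound contributes equally, then project.
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)          # sample coordinates on the biplot
loadings = pca.components_.T            # marker-compound loading vectors

# Complementary hierarchical cluster analysis (Ward linkage on the same data).
Z = linkage(Xs, method="ward")
clusters = fcluster(Z, t=5, criterion="maxclust")

print("explained variance:", pca.explained_variance_ratio_)
print("cluster membership:", clusters)
```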

Relevance: 20.00%

Abstract:

Vehicle detectors are installed approximately every 300 meters on each lane of the Tokyo Metropolitan Expressway. They collect various traffic data such as traffic volume, average speed and time occupancy. By comparing data collected at consecutive points, the traffic characteristics of each point can be understood. In this study, we focused on average speed, analyzed road potential in terms of operating speed during free-flow conditions, and identified latent bottlenecks. Furthermore, we analyzed the effects of rainfall level and day of the week on road potential. This method of analysis is expected to be useful for the deployment of ITS applications such as driving assistance, for estimating parameters for traffic simulation, and for providing feedback to road design as a congestion countermeasure.
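
A minimal sketch of this style of analysis, under the assumption that per-detector average speeds are available as a matrix: free-flow operating speed ("road potential") is approximated by a high percentile, and a detector whose potential drops sharply relative to its upstream neighbour is flagged as a latent-bottleneck candidate. The data, detector count and threshold below are illustrative, not the expressway measurements.

```python
# Hedged sketch: estimate road potential per detector and flag latent bottlenecks.
import numpy as np

# speeds[i, t]: average speed (km/h) at detector i in 5-minute interval t (placeholder data)
rng = np.random.default_rng(1)
speeds = 80 + rng.normal(0, 5, size=(20, 288))
speeds[12] -= 15          # artificial dip at detector 12

# Road potential: a high percentile of observed speeds approximates free-flow speed.
free_flow = np.percentile(speeds, 85, axis=1)

# A latent bottleneck is suggested where potential falls well below the adjacent
# upstream detector (the 10 km/h threshold is chosen arbitrarily here).
drops = free_flow[:-1] - free_flow[1:]
latent = np.where(drops > 10)[0] + 1
print("candidate latent bottlenecks at detectors:", latent)
```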

Relevance: 20.00%

Abstract:

"By understanding how places have evolved, we are better able to guide development and change in the urban fabric and avoid the incongruity created by so much of the modern environment" (MacCormac, R (1996), An anatomy of London, Built Environment, Dec 1996 This paper proposes a theory on the relevance of mapping the evolutionary aspects of historical urban form in order to develop a measure of evaluating architectural elements within urban forms, through to deriving parameters for new buildings. By adopting Conzen's identification of the tripartite division of urban form; the consonance inurban form of a particular palce resides in the elements and measurable values tha makeup the fine grain aggregates of urban form. The paper will demonstrate throughthe case study of Brisbane in Australia, a method of conveying these essential components that constitute a cities continuity of form and active usage. By presenting the past as a repository of urban form characteristics, it is argued that concise architectural responses that stem from such knowledge should result in an engaged urban landscape. The essential proposition is that urban morphology is a missing constituent in the process of urban design, and that the approach of the geographical discipline to the study of urban morphology holds the key to providing the evidence of urban growth characteristics, and this methodology suggests possibilities for an architectural approach that can comprehensively determine qualitative aspects of urban buildings. The relevance of this research lies in a potential to breach the limitations of current urban analysis whilst continuing the evolving currency of urban morphology as an integral practice in the design of our cities.

Relevance: 20.00%

Abstract:

Effective information and knowledge management (IKM) is critical to corporate success, yet its actual establishment and management are not yet fully understood. We identify ten organizational elements that need to be addressed to ensure the effective implementation and maintenance of information and knowledge management within organizations. We define these elements and provide key characterizations. We then discuss a case study that describes the implementation of an information system (designed to support IKM) in a medical supplies organization. We apply the framework of organizational elements in our analysis to uncover the enablers and barriers in this system implementation project. Our analysis suggests that taking the ten organizational elements into consideration when implementing information systems will assist practitioners in managing information and knowledge processes more effectively and efficiently. We discuss implications for future research.

Relevance: 20.00%

Abstract:

Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for the automatic processing of LIDAR point cloud data, in particular for feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and classify ground and non-ground points by statistically analyzing the skewness and kurtosis of the intensity data. Moreover, the Hough transform is employed to detect power lines from the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained by using LIDAR intensity data than elevation data.
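
A hedged sketch of an intensity-based ground/object split in the spirit described above: high-intensity returns are peeled off until the remaining intensity distribution has near-zero skewness. The point data are synthetic and the stopping threshold is an assumption, not the authors' parameters; the resulting object points could then be rasterised and passed to a line detector such as skimage.transform.hough_line to extract power-line candidates.

```python
# Skewness-balancing style filter on LIDAR intensity (illustrative only).
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(2)
intensity = np.concatenate([rng.normal(60, 5, 5000),    # ground-like returns
                            rng.normal(120, 20, 800)])  # object-like returns

# Iteratively remove the highest-intensity point until the remaining distribution
# is roughly symmetric (skewness near zero). Removed points are treated as
# non-ground "object" points, the remainder as ground.
mask = np.ones(intensity.size, dtype=bool)
while skew(intensity[mask]) > 0.1 and mask.sum() > 100:
    mask[np.argmax(np.where(mask, intensity, -np.inf))] = False

print("ground points:", mask.sum(), "object points:", (~mask).sum())
print("residual skewness:", skew(intensity[mask]),
      "kurtosis:", kurtosis(intensity[mask]))
```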

Relevance: 20.00%

Abstract:

Aims: To assess the effectiveness of current treatment approaches to assist benzodiazepine discontinuation. Methods: A systematic review of approaches to benzodiazepine discontinuation in general practice and out-patient settings was undertaken. Routine care was compared with three treatment approaches: brief interventions, gradual dose reduction (GDR) and psychological interventions. GDR was also compared with GDR plus psychological interventions or substitutive pharmacotherapies. Results: Inclusion criteria were met by 24 studies, and a further eight were identified by a further search. GDR [odds ratio (OR) = 5.96, confidence interval (CI) = 2.08–17.11] and brief interventions (OR = 4.37, CI = 2.28–8.40) provided superior cessation rates at post-treatment to routine care. Psychological treatment plus GDR was superior to both routine care (OR = 3.38, CI = 1.86–6.12) and GDR alone (OR = 1.82, CI = 1.25–2.67). However, substitutive pharmacotherapies did not add to the impact of GDR (OR = 1.30, CI = 0.97–1.73), and abrupt substitution of benzodiazepines by other pharmacotherapy was less effective than GDR alone (OR = 0.30, CI = 0.14–0.64). Few studies of any technique achieved significantly greater benzodiazepine discontinuation than controls at follow-up. Conclusions: Providing an intervention is more effective than routine care. Psychological interventions may improve discontinuation above GDR alone. While some substitutive pharmacotherapies may have promise, current evidence is insufficient to support their use.
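
For readers unfamiliar with the statistics quoted above, the snippet below shows how an odds ratio and its 95% confidence interval are conventionally computed from a 2 × 2 cessation table; the counts are invented for illustration and are not taken from the review.

```python
# Worked example: odds ratio and 95% CI from a 2x2 table (invented counts).
import math

#               ceased   not ceased
# intervention    a          b
# routine care    c          d
a, b, c, d = 30, 20, 12, 38

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```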

Relevance: 20.00%

Abstract:

A diagnosis of cancer represents a significant crisis for the child and their family. As the treatment for childhood cancer has improved dramatically over the past three decades, most children diagnosed with cancer today survive the illness. However, it is still an illness that severely disrupts the lifestyle and typical functioning of the family unit. Most treatments for cancer involve lengthy hospital stays, the endurance of painful procedures and harsh side effects. Research has confirmed that, to manage and adapt to such a crisis, families must undertake measures which assist their adjustment. Variables such as the level of family support, the quality of the parents' marital relationship, the coping of other family members, the absence of other concurrent stresses and open communication within the family have been identified as influences on how well families adjust to a diagnosis of childhood cancer. Theoretical frameworks such as the Resiliency Model of Family Adjustment and Adaptation (McCubbin and McCubbin, 1993, 1996) and the Stress and Coping Model of Lazarus and Folkman (1984) have been used to explain how families and individuals adapt to crises or adverse circumstances. Developmental theories have also been proposed to account for how children come to understand and learn about the concept of illness. However, more descriptive information about how families, and children in particular, experience and manage a diagnosis of cancer is still needed. There are still many unanswered questions surrounding how a child adapts to, understands and makes meaning from having a life-threatening illness. As a result, developing an understanding of the impact that such a serious illness has on the child and their family is crucial. A new approach to examining childhood illnesses such as cancer is currently underway which allows a greater understanding of the experience of childhood cancer to be achieved. This approach invites a phenomenological method to investigate the perspectives of those affected by childhood cancer. In the current study, nine families in which there was a diagnosis of childhood cancer were interviewed twice over a 12-month period. Using the qualitative methodology of Interpretative Phenomenological Analysis (IPA), a semi-structured interview was used to explicate the experience of childhood cancer from both the parents' and the child's perspectives. A number of quantitative measures were also administered to gather specific information on the demographics of the sample. The results of this study revealed a number of pertinent areas which need to be considered when treating such families. More importantly, experiences were explicated which revealed vital phenomena that need to be added to extend current theoretical frameworks. Parents identified the time of the diagnosis as the hardest part of their entire experience. Parents experienced an internal struggle when they were forced to come to the realization that they were not able to help their child get well. Families demonstrated an enormous ability to develop a new lifestyle which accommodated the needs of the sick child, as the sick child became the focus of their lives. Regarding the children, many of them accepted their diagnosis without complaint or question, and they were able to recognise and appreciate the support they received. Physical pain was certainly a component of the children's experience; however, the emotional strain of the loss of peer contact seemed just as severe.
Changes over time were also noted, as both parental and child experiences were often pertinent to the stage of treatment the child had reached. The approach used in this study allowed rich and intimate detail about a sensitive issue to be revealed. It also allowed the experience of childhood cancer for parents and children to be more fully realised. Only now can a comprehensive and sensitive medical and psychosocial approach to the child and family be developed. For example, families may benefit from extra support at the time of diagnosis, as this was identified as one of the most difficult periods. Parents may also require counselling support in coming to terms with their inability to help their child heal. Given the ease with which children accepted their diagnosis, we need to question whether children are more receptive to adversity. Yet the emotional struggle children battled as a result of their illness also needs to be addressed.

Relevance: 20.00%

Abstract:

Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is an FEA technique whose analysis is derived from a single 2D radiographic image. This ex-vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD.
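
To make the stiffness measure concrete, here is a deliberately simple one-dimensional, two-element finite element model that assembles a stiffness matrix, applies a load and reports structural stiffness in N/mm. The material and geometry values are illustrative assumptions; this is a generic FEA toy, not the 3D-FEXI method.

```python
# Toy 1D finite element model illustrating stiffness (N/mm), not 3D-FEXI.
import numpy as np

E = 17000.0      # Young's modulus, MPa (illustrative value)
A = 100.0        # cross-sectional area, mm^2
L = 50.0         # element length, mm
k = E * A / L    # axial stiffness of one bar element, N/mm

# Global stiffness matrix for two bar elements in series (3 nodes).
K = np.array([[ k,  -k,   0],
              [-k, 2*k,  -k],
              [ 0,  -k,   k]])

F = 1000.0                                    # applied load at the free end, N
Kr = K[1:, 1:]                                # fix node 0 (remove its row/column)
u = np.linalg.solve(Kr, np.array([0.0, F]))   # nodal displacements, mm

print("tip displacement:", u[-1], "mm")
print("structural stiffness:", F / u[-1], "N/mm")
```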

Relevance: 20.00%

Abstract:

Public key cryptography, and with it the ability to compute digital signatures, have made it possible for electronic commerce to flourish. It is thus unsurprising that the proposed Australian NECS will also utilise digital signatures in its system so as to provide a fully automated process from the creation of an electronic land title instrument to the digital signing and electronic lodgment of these instruments. This necessitates an analysis of the fraud risks raised by the use of digital signatures, because a compromise of the integrity of digital signatures will lead to a compromise of the Torrens system itself. This article will show that digital signatures may in fact offer greater security against fraud than handwritten signatures; but to achieve this, digital signatures require an infrastructure in which each component is properly implemented and managed.
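
As a generic illustration of the signing-and-verification mechanics the article relies on (not the NECS workflow itself), the sketch below uses an Ed25519 key pair from the Python cryptography package; the "instrument" bytes are a placeholder.

```python
# Generic digital-signature example (placeholder data, not the NECS workflow).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

instrument = b"electronic land title transfer instrument (placeholder bytes)"
signature = private_key.sign(instrument)          # created with the private key

try:
    public_key.verify(signature, instrument)      # anyone can verify with the public key
    print("signature valid: instrument is authentic and unaltered")
except InvalidSignature:
    print("signature invalid")

# Tampering with even one byte invalidates the signature.
try:
    public_key.verify(signature, instrument + b"!")
except InvalidSignature:
    print("tampered instrument rejected")
```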

Relevance: 20.00%

Abstract:

Brief overview of topics/issues of interest at the end of 2009, including: Spatial Science students undertake a variety of research projects; labs and offices on the move again; congratulations to a Surveying student project at the QSEA awards.

Relevance: 20.00%

Abstract:

This thesis investigates the problem of robot navigation using only landmark bearings. The proposed system allows a robot to move to a ground target location specified by the sensor values observed at this ground target position. The control actions are computed based on the difference between the current landmark bearings and the target landmark bearings. No Cartesian coordinates with respect to the ground are computed by the control system; the robot navigates using solely information from the bearing sensor space. Most existing robot navigation systems require a ground frame (a 2D Cartesian coordinate system) in order to navigate from a ground point A to a ground point B. Commonly used sensors such as laser range scanners, sonar, infrared and vision do not directly provide the 2D ground coordinates of the robot. Existing systems use the sensor measurements to localise the robot with respect to a map, a set of 2D coordinates of the objects of interest. It is more natural to navigate between the points in the sensor space corresponding to A and B without requiring the Cartesian map and the localisation process. Research on animals has revealed how insects are able to exploit very limited computational and memory resources to successfully navigate to a desired destination without computing Cartesian positions. For example, a honeybee balances the left and right optical flows to navigate in a narrow corridor. Unlike many other ants, Cataglyphis bicolor does not secrete pheromone trails in order to find its way home but instead uses the sun as a compass to keep track of its home direction vector. The home vector can be inaccurate, so the ant also uses landmark recognition; more precisely, it takes snapshots and compass headings of some landmarks. To return home, the ant tries to line up the landmarks exactly as they were before it started wandering. This thesis introduces a navigation method based on reflex actions in sensor space. The sensor vector is made up of the bearings of some landmarks, and the reflex action is a gradient descent with respect to the distance in sensor space between the current sensor vector and the target sensor vector. Our theoretical analysis shows that, except for some fully characterized pathological cases, any point is reachable from any other point by reflex action in the bearing sensor space, provided the environment contains three landmarks and is free of obstacles. The trajectories of a robot using reflex navigation, like those of other image-based visual control strategies, do not necessarily correspond to the shortest paths on the ground, because it is the sensor error that is minimized, not the moving distance on the ground. However, we show that the use of a sequence of waypoints in sensor space can address this problem. In order to identify relevant waypoints, we train a Self Organising Map (SOM) from a set of observations uniformly distributed with respect to the ground. This SOM provides a sense of location to the robot and allows a form of path planning in sensor space. The proposed navigation system is analysed theoretically and evaluated both in simulation and in experiments on a real robot.
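
A small sketch of the reflex idea, assuming three landmarks in an obstacle-free plane: the simulated robot repeatedly moves against the numerical gradient of the squared difference, in bearing space, between its current and target landmark bearings. The geometry, step size and iteration count are illustrative and do not reproduce the thesis implementation.

```python
# Reflex navigation sketch: gradient descent on bearing-space error (illustrative).
import numpy as np

landmarks = np.array([[0.0, 10.0], [10.0, 0.0], [-8.0, -6.0]])   # three landmarks

def bearings(p):
    """Bearing (radians) from position p to each landmark."""
    d = landmarks - p
    return np.arctan2(d[:, 1], d[:, 0])

def sensor_error(p, target_bearings):
    diff = np.angle(np.exp(1j * (bearings(p) - target_bearings)))  # wrap to [-pi, pi]
    return 0.5 * np.sum(diff ** 2)

target = np.array([4.0, 3.0])
target_bearings = bearings(target)          # "snapshot" taken at the goal
p = np.array([-5.0, -5.0])                  # start position

step, eps = 1.0, 1e-4
for _ in range(1000):
    # Numerical gradient of the sensor-space error; move against it (reflex action).
    grad = np.array([(sensor_error(p + eps * e, target_bearings) -
                      sensor_error(p - eps * e, target_bearings)) / (2 * eps)
                     for e in np.eye(2)])
    p = p - step * grad

print("final position:", np.round(p, 3), "goal:", target)
```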

Relevance: 20.00%

Abstract:

Surveillance networks are typically monitored by a few people viewing several monitors that display the camera feeds, which makes it very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as a means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies: to recognize an event, people and objects must be tracked, and tracking also enhances the performance of tasks such as crowd analysis or human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or because the detection routines are unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, making it unnecessary to carry out detection separately except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter. A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Evaluation du Traitement et de l'Interpretation de Sequences vidEO - Evaluation for video understanding) database, and a significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, while the tracking system benefits from the improved performance in uncertain conditions arising from occlusion and noise provided by a particle filter. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either modality individually.
Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore improve security in areas under surveillance.
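
For context on the particle-filter machinery mentioned above, the following is a minimal, generic sampling-importance-resampling filter for a single one-dimensional target. It illustrates only the predict / weight / resample cycle; it is not the Scalable Condensation Filter, and all noise parameters are illustrative.

```python
# Minimal SIR particle filter for one 1D target (generic textbook sketch).
import numpy as np

rng = np.random.default_rng(3)
n_particles, n_frames = 500, 60
true_pos, velocity = 0.0, 1.0

particles = rng.normal(true_pos, 2.0, n_particles)      # initial detection spread
weights = np.full(n_particles, 1.0 / n_particles)

for t in range(n_frames):
    true_pos += velocity
    measurement = true_pos + rng.normal(0, 1.0)          # noisy feature/detection

    # Predict: propagate particles through a constant-velocity motion model plus noise.
    particles += velocity + rng.normal(0, 0.5, n_particles)

    # Update: weight each particle by the likelihood of the measurement.
    weights = np.exp(-0.5 * ((measurement - particles) / 1.0) ** 2)
    weights /= weights.sum()

    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    particles = particles[idx]
    weights.fill(1.0 / n_particles)

print("true position:", true_pos, "estimate:", particles.mean())
```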

Relevance: 20.00%

Abstract:

Controlled rate thermal analysis (CRTA) technology offers better resolution and a more detailed interpretation of the decomposition processes of a clay mineral such as sepiolite: by eliminating the slow transfer of heat to the sample as a controlling parameter, it approaches equilibrium conditions of decomposition. Constant-rate decomposition processes of a non-isothermal nature reveal changes in the sepiolite as it is converted to an anhydride. In the dynamic experiment, two dehydration steps are observed over the approximately 20–170 and 170–350 °C temperature ranges, and three dehydroxylation steps are observed over the temperature ranges 201–337, 337–638 and 638–982 °C. The CRTA technology enables the separation of the thermal decomposition steps.

Relevance: 20.00%

Abstract:

This study analyses the Inclusive Education Statement – 2005, Education Queensland (Appendix 1). The Statement was a product of the Queensland State Government's response to federal legislation. The federal Disability Discrimination Act (DDA) 1992 and the subsequent Standards for Education 2005 sought to eliminate discrimination against people with disabilities. Under Section 22 of the Act, it became unlawful for an educational authority to discriminate against a person on the grounds of the person's disability.