934 results for Latent fingerprint


Relevance:

10.00%

Publisher:

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Because a large number of Web services are available, finding an appropriate Web service for a user's requirement is a challenge, which warrants an effective and reliable process of Web service discovery. A considerable body of research has emerged on methods to improve the accuracy of Web service discovery so that the best service is matched. The discovery process typically suggests many individual services that each partially fulfil the user's interest. Considering the semantic relationships of the words used to describe services, together with their input and output parameters, can lead to more accurate Web service discovery, and appropriate linking of individually matched services can then fully satisfy the user's requirements. This research integrates a semantic model and a data mining technique to enhance the accuracy of Web service discovery, through a novel three-phase methodology. The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of Web Services Description Language (WSDL) documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large collection of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel built over a large vocabulary helps to find hidden meanings of query terms that could not otherwise be found. Sometimes a single Web service cannot fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The second phase checks the feasibility of linking multiple Web services. Once that feasibility is established, the objective is to provide the user with the best composition of Web services: in this link analysis phase, the Web services are modelled as nodes of a graph, and an all-pairs shortest-path algorithm is applied to find the minimum-cost traversal path. The third phase, system integration, combines the results of the preceding two phases using an original fusion algorithm in a fusion engine. Finally, a recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. Extensive experimentation was performed to evaluate the proposed method. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method; the proposed method outperforms both. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results further show that the fusion engine boosts the accuracy of Web service discovery by systematically combining the inputs from the semantic analysis (phase I) and the link analysis (phase II). Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
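A minimal sketch of the phase-I matching idea, under stated assumptions: a plain truncated-SVD (LSA) kernel built with scikit-learn stands in for the thesis's support-based kernel (its binning-and-merging construction is not reproduced), and the service descriptions are hypothetical.

```python
# Hedged sketch: LSA-style kernel matching of a query against service
# descriptions. Descriptions and the query are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

service_docs = [
    "returns current weather forecast for a given city",
    "books a hotel room given city, check-in and check-out dates",
    "converts an amount between two currencies",
]
query = "find tomorrow's weather in a town"

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(service_docs + [query])

# Project term vectors into a low-rank latent space; with a large corpus
# this lets query terms match semantically related (not just identical)
# description terms.
svd = TruncatedSVD(n_components=2, random_state=0)
Z = svd.fit_transform(X)

# Rank services by kernel similarity to the query (last row of Z).
scores = cosine_similarity(Z[-1:], Z[:-1])[0]
for doc, s in sorted(zip(service_docs, scores), key=lambda t: -t[1]):
    print(f"{s:.3f}  {doc}")
```

Ranking by similarity in the latent space rather than by raw term overlap is what allows partially matching services to surface for later composition in phase II.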

Relevance:

10.00%

Publisher:

Abstract:

In Bryan v Maloney, the High Court extended a builder's duty of care to encompass a liability in negligence for the pure economic loss sustained by a subsequent purchaser of a residential dwelling as a result of latent defects in the building's construction. Recently, in Woolcock Street Investments Pty Ltd v CDG Pty Ltd, the Court refused to extend this liability to defects in commercial premises. The decision therefore provides an opportunity to re-examine the rationale and policy behind current jurisprudence governing builders' liability for pure economic loss. In doing so, this article considers the principles relevant to the determination of a duty of care generally, and whether the differences between purchasers of residential and commercial properties are as great as the case law suggests.

Relevance:

10.00%

Publisher:

Abstract:

Objective: The review addresses two distinct sets of issues: (1) specific functionality, interface, and calculation problems that presumably can be fixed or improved; and (2) the more fundamental question of whether the system is close to being ready for 'commercial prime time' in the North American market. Findings: Many of our comments relate to the first set of issues, especially Sections B and C; Sections D and E deal with the second set. Overall, we feel that LCADesign represents a very impressive step forward in the ongoing quest to link CAD with LCA tools and, more importantly, to link the world of architectural practice with that of environmental research. From that perspective, it deserves continued financial support as a research project. However, if the decision is whether or not to continue the development programme from a purely commercial perspective, we are less bullish. In the North American market, there are no regulatory or other drivers pressing design teams to use a tool of this nature. There is certainly interest in this area, but the tools must be very easy to use with little or no training, and understanding the results is as important in this regard as knowing how to apply the tool; our comments are fairly negative on that aspect. Our opinion might change to some degree once the fixes are made and the functionality is improved. However, as discussed in more detail in the following sections, we feel that the multi-step process (CAD to IFC to LCADesign) could pose a serious problem for market acceptance. The CAD-to-IFC part is impossible for us to judge with the information provided, and we cannot even begin to answer the question about the ease of importing designs into the software, but it appears cumbersome from what we do know. There does appear to be a developing North American market for 3D CAD, with a recent survey indicating that about 50% of firms use some form of 3D modeling for about 75% of their projects; however, this does not mean that full 3D CAD is always being used. Our information suggests that Autodesk accounts for about 75 to 80% of the 3D CAD market, and it is very cautious about any links that do not serve a latent demand. Finally, other systems that link CAD to energy simulation use XML data-transfer protocols rather than IFC files, and it is our understanding that the market served by Autodesk currently tends in that direction. This subject is outside our area of expertise, so please take these comments as suggestions for more intensive market research rather than as definitive findings.

Relevance:

10.00%

Publisher:

Abstract:

Despite changes in surgical techniques and radiotherapy targeting, and the apparent earlier detection of cancers, secondary lymphoedema is still a significant problem for about 20–30% of those who receive treatment for cancer, although the incidence and prevalence do seem to be falling. These figures generally relate to detection of an enlarged limb or other area, but about 60% of all patients also suffer other problems related to how the limb feels, what can or cannot be done with it, and a range of social or psychological issues. Often these 'subjective' changes occur before the objective ones, such as a change in arm volume or circumference. For most of those treated for cancer, lymphoedema does not develop immediately; while about 60–70% develop it in the first few years, some do not develop lymphoedema for up to 15 or 20 years. Those who will later develop clinically manifest lymphoedema are, for some time, in a latent or hidden phase of the condition. There also seem to be risk factors that indicate a higher likelihood of lymphoedema post-treatment, including oedema at the surgical site, arm dominance, age, skin conditions, and body mass index (BMI).

Relevance:

10.00%

Publisher:

Abstract:

Molecular and metal profile fingerprints were obtained from a complex substance, Atractylis chinensis DC, a traditional Chinese medicine (TCM), using high-performance liquid chromatography (HPLC) and inductively coupled plasma atomic emission spectroscopy (ICP-AES). This substance was used in this work as an example of a complex biological material that has found application as a TCM. The samples, traditionally processed by the Bran, Cut, Fried and Swill methods, were collected from five provinces in China. The data matrices obtained from the two types of analysis produced two principal component analysis (PCA) biplots, which showed that the HPLC fingerprint data discriminated on the basis of the methods used to process the raw TCM, while the metal-analysis data grouped according to geographical origin. When the two data matrices were combined into one two-way matrix, the resulting biplot showed a clear separation on the basis of the HPLC fingerprints. Importantly, within each grouping the objects separated according to their geographical origin, ranking in approximately the same order in each group. This result suggests that such an approach, using the two kinds of analytical data together, can improve the characterisation of complex TCM materials. In addition, two supervised pattern-recognition methods, K-nearest neighbours (KNN) and linear discriminant analysis (LDA), were successfully applied to the individual data matrices, thus supporting the PCA approach.
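A minimal sketch of the fused-matrix analysis, under stated assumptions: random numbers stand in for the HPLC peak areas and ICP-AES metal concentrations, and the province labels are hypothetical, so this only illustrates the column-wise fusion, PCA scores, and KNN/LDA steps.

```python
# Hedged sketch: fuse two analytical data matrices, compute PCA scores
# for a biplot, and apply KNN and LDA. All data here are stand-ins.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
hplc = rng.normal(size=(20, 12))        # stand-in HPLC fingerprint matrix
metals = rng.normal(size=(20, 8))       # stand-in ICP-AES metal matrix
origin = np.repeat(np.arange(5), 4)     # 5 provinces, 4 samples each

# Column-wise augmentation into one two-way matrix, autoscaled so that
# neither technique dominates the principal components.
fused = StandardScaler().fit_transform(np.hstack([hplc, metals]))

scores = PCA(n_components=2).fit_transform(fused)  # coordinates for a biplot

# Supervised pattern recognition on the same matrix.
for clf in (KNeighborsClassifier(n_neighbors=3), LinearDiscriminantAnalysis()):
    print(type(clf).__name__, clf.fit(fused, origin).score(fused, origin))
```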

Relevance:

10.00%

Publisher:

Abstract:

Vehicle detectors are installed approximately every 300 meters in each lane of the Tokyo Metropolitan Expressway, and they collect various traffic data such as traffic volume, average speed and time occupancy. The traffic characteristics of each point can be understood by comparing data collected at consecutive points. In this study, we focused on average speed, analysed road potential in terms of operating speed during free-flow conditions, and identified latent bottlenecks. Furthermore, we analysed the effects of rainfall level and day of the week on road potential. This method of analysis is expected to be useful for deploying ITS applications such as driving assistance, for estimating traffic-simulation parameters, and for feedback into road design as a congestion countermeasure.
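A minimal sketch of the bottleneck-screening idea, under the assumption (not from the paper) that road potential is summarised as the 85th-percentile free-flow speed at each detector; a sharp drop relative to the upstream detector flags a latent bottleneck. Speeds and thresholds are illustrative.

```python
# Hedged sketch: screen consecutive detectors for latent bottlenecks.
import numpy as np

# speeds[i] = free-flow speed samples (km/h) at consecutive detectors
rng = np.random.default_rng(1)
speeds = [rng.normal(loc, 4, size=200) for loc in (82, 81, 80, 68, 79)]

# "Road potential" taken here as the 85th-percentile free-flow speed.
potential = [np.percentile(s, 85) for s in speeds]

for i in range(1, len(potential)):
    drop = potential[i - 1] - potential[i]
    if drop > 5.0:  # illustrative threshold (km/h)
        print(f"latent bottleneck suspected at detector {i}: "
              f"potential falls {drop:.1f} km/h from upstream")
```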

Relevance:

10.00%

Publisher:

Abstract:

As user involvement becomes a necessary part of the product development process, various ways of accessing users' latent needs have been developed and studied. Reviews of the literature on user involvement and product development reveal that accessing users' latent needs and transferring them into the design process can be facilitated by effectively implementing user-designer collaboration during the early stage of design. In this paper, various types of user-designer collaboration were observed, and their distinct characteristics were classified into three categories: (1) passive objectivity, (2) workplace democratisation, and (3) shared contexts. These strategies for better user-designer collaboration have been employed in user-centred design, user participatory design and design for experiencing. Based on the literature review, this paper proposes a basic collaboration mechanism between users and designers for the early stage of the design process, and discusses how this mechanism helps to describe the interactions between users and designers during user involvement sessions.

Relevance:

10.00%

Publisher:

Abstract:

The Longitudinal Study of Australian Children (LSAC) is a major national study examining the lives of Australian children, using a cross-sequential cohort design and data from parents, children, and teachers for 5,107 infants (3–19 months) and 4,983 children (4–5 years). Its data are publicly accessible and are used by researchers from many disciplinary backgrounds. It contains multiple measures of children's developmental outcomes as well as a broad range of information on the contexts of their lives. This paper reports on the development of summary outcome indices of child development using the LSAC data. The indices were developed to fill the need for indicators suitable for use by diverse data users, in order to guide government policy and interventions that support young children's optimal development. The concepts underpinning the indices and the methods of their development are presented. Two outcome indices (infant and child) were developed, each consisting of three domains: health and physical development, social and emotional functioning, and learning competency. A total of 16 measures make up these three domains in the Outcome Index for the child cohort, and six measures in the index for the infant cohort. These measures are described, and evidence supporting the structure of the domains and their underlying latent constructs is provided for both cohorts. The factorial structure of the Outcome Index was adequate for both cohorts, but stronger for the child cohort than the infant cohort. It is concluded that the LSAC Outcome Index is a parsimonious measure representing the major components of development that is suitable for non-specialist data users. A companion paper (Sanson et al. 2010) presents evidence of the validity of the Index.
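A minimal sketch of how a summary index of this kind can be assembled, under stated assumptions: hypothetical measures are standardised and averaged within each domain, and the domain scores are then averaged into one index. The actual LSAC scoring rules and measure names are not reproduced here.

```python
# Hedged sketch: equal-weight summary index from domain-grouped measures.
import pandas as pd

df = pd.DataFrame({            # hypothetical measures, one row per child
    "health_1": [3.2, 2.8, 3.9], "health_2": [4.1, 3.5, 4.4],
    "social_1": [2.9, 3.3, 3.8], "social_2": [3.0, 2.7, 4.0],
    "learn_1":  [3.6, 2.9, 4.2], "learn_2":  [3.8, 3.1, 4.5],
})
domains = {
    "physical": ["health_1", "health_2"],
    "social_emotional": ["social_1", "social_2"],
    "learning": ["learn_1", "learn_2"],
}

z = (df - df.mean()) / df.std()                 # standardise each measure
domain_scores = pd.DataFrame(
    {d: z[cols].mean(axis=1) for d, cols in domains.items()}
)
summary_index = domain_scores.mean(axis=1)      # equal-weight overall index
print(domain_scores.assign(summary_index=summary_index))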

Relevance:

10.00%

Publisher:

Abstract:

The critical impact of innovation on national and global economies has been discussed at length in the literature. Economic development requires the diffusion of innovations into markets, and it has long been recognised that economic growth and development depend upon a constant stream of innovations. Governments have been keenly aware of the need to ensure this flow does not dry to a trickle and have introduced many and varied industry policies and interventions to assist in seeding, supporting and diffusing innovations. In Australia, as in many countries, government support for the transfer of knowledge, especially from publicly funded research, has resulted in the creation of knowledge exchange intermediaries. These intermediaries are themselves service organisations, seeking innovative service offerings for their markets. The choice for most intermediaries is generally a dichotomous one, between market-pull and technology-push knowledge exchange programmes. In this article, we undertake a case analysis of one such innovative intermediary and its flagship programme, and compare this case with other successful intermediaries in Europe. We put forward the research proposition that the design of intermediary programmes must match the service type they offer: market-pull programmes require market-pull design, in close collaboration with industry, whereas technology-push programmes can offer problem-solving innovations for which demand is latent. The discussion reflects the need for an evolution in knowledge transfer policies and programmes beyond the first generation ushered in with the US Bayh-Dole Act (1980) and Stevenson-Wydler Act (1984). The data analysed form a case-study comparison of market-pull and technology-push programmes, focusing on primary and secondary socio-economic benefits (using both Australian and international comparisons).

Relevance:

10.00%

Publisher:

Abstract:

Hamilton (2001) makes a number of comments on our paper (Harding and Pagan, 2002b). The objectives of this rejoinder are, firstly, to note the areas in which we agree; secondly, to define with greater clarity the areas in which we disagree; and, thirdly, to point to other papers, including a longer version of this response, where we have dealt with some of the issues that he raises. The core of our debate with him is whether one should use an algorithm with a specified set of rules for determining the turning points in economic activity, or a parametric model that features latent states. Hamilton begins his criticism by stating that there is a philosophical distinction between the two methods for dating cycles and concludes that the method we use "leaves vague and intuitive exactly what this algorithm is intended to measure". Nothing is further from the truth. When seeking ways to decide whether a turning point has occurred, it is always useful to ask: what is a recession? Common usage suggests that it is a decline in the level of economic activity that lasts for some time. For this reason it has become standard to describe a recession as a decline in GDP that lasts for at least two quarters. Finding periods in which quarterly GDP declined for two periods is exactly what our approach does. What is vague about this?
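A minimal sketch of the rule-based dating idea on a hypothetical quarterly GDP series: flag as recession any run of two or more consecutive quarterly declines. This is the simple two-quarter rule described above, not the authors' full turning-point algorithm.

```python
# Hedged sketch: two-quarter recession rule on an illustrative GDP series.
gdp = [100.0, 100.8, 101.5, 101.1, 100.4, 99.9, 100.6, 101.4]  # hypothetical

# falls[t-1] is True when GDP declined from quarter t-1 to quarter t
falls = [gdp[t] < gdp[t - 1] for t in range(1, len(gdp))]

recession = []
run = 0
for t, fell in enumerate(falls, start=1):
    run = run + 1 if fell else 0
    if run == 2:            # second consecutive decline: quarters t-1 and t
        recession.extend([t - 1, t])
    elif run > 2:           # decline continues
        recession.append(t)

print("recession quarters (index into gdp):", recession)
```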

Relevance:

10.00%

Publisher:

Abstract:

The main objective of this PhD was to further develop Bayesian spatio-temporal models, specifically the conditional autoregressive (CAR) class of models, for the analysis of sparse disease outcomes such as birth defects. The motivation for the thesis arose from problems encountered when analysing a large birth defect registry in New South Wales. The specific components and related research objectives of the thesis were developed from gaps in the literature on current formulations of the CAR model, and from health service planning requirements. Data from a large probabilistically linked database from 1990 to 2004, consisting of fields from two separate registries, the Birth Defect Registry (BDR) and the Midwives Data Collection (MDC), were used in the analyses. The main objective was split into smaller goals. The first goal was to determine how the specification of the neighbourhood weight matrix affects the smoothing properties of the CAR model; this is the focus of Chapter 6. The second was to evaluate the usefulness of incorporating a zero-inflated Poisson (ZIP) component as well as a shared-component model when modelling a sparse outcome; this is carried out in Chapter 7. The third goal was to identify optimal sampling and sample-size schemes for selecting individual-level data for a hybrid ecological spatial model; this is done in Chapter 8. Finally, the earlier improvements to the CAR model are brought together and, along with demographic projections, used to provide forecasts for birth defects at the Statistical Local Area (SLA) level; Chapter 9 describes how this is done. For the first objective, I examined a series of neighbourhood weight matrices and showed how smoothing the relative risk estimates according to similarity in an important covariate (i.e. maternal age) helped improve the model's ability to recover the underlying risk, compared with the traditional adjacency (specifically the queen) method of applying weights. Next, to address the sparseness and excess zeros commonly encountered in the analysis of rare outcomes such as birth defects, I compared several models, including an extension of the usual Poisson model to encompass excess zeros in the data. This was achieved via a mixture model, which also incorporated the shared-component model to improve the estimation of sparse counts by borrowing strength, across a shared component (e.g. latent risk factors), from a referent outcome (caesarean section in this example). Using the Deviance Information Criterion (DIC), I showed that the proposed model performed better than the usual models, but only when both outcomes shared a strong spatial correlation. The next objective involved identifying the optimal sampling and sample-size strategy for incorporating individual-level data with areal covariates in a hybrid study design. I performed extensive simulation studies, evaluating thirteen different sampling schemes along with variations in sample size, in the context of an ecological regression model that incorporated spatial correlation in the outcomes and accommodated both individual and areal measures of covariates. Using the average mean squared error (AMSE), I showed that a simple random sample of 20% of the SLAs, followed by selecting all cases in the chosen SLAs along with an equal number of controls, provided the lowest AMSE. The final objective involved combining the improved spatio-temporal CAR model with population (i.e. women) forecasts to provide 30-year annual estimates of birth defects at the SLA level in New South Wales, Australia. The projections were illustrated using sixteen different SLAs representing various areal measures of socio-economic status and remoteness, and a sensitivity analysis of the assumptions used in the projection was undertaken. Overall, the thesis shows how challenges in the spatial analysis of rare diseases such as birth defects can be addressed: by formulating the neighbourhood weight matrix to smooth according to a key covariate (i.e. maternal age), by incorporating a ZIP component to model excess zeros in outcomes, and by borrowing strength from a referent outcome (i.e. caesarean counts). An efficient strategy for sampling individual-level data, and sample-size considerations for rare diseases, are also presented, and projections of birth defect categories at the SLA level are made.
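A minimal sketch of the weight-matrix idea in the first objective, under stated assumptions: hypothetical areas with a queen-style 0/1 adjacency are re-weighted by similarity in mean maternal age, so that smoothing borrows more strength from demographically similar neighbours. The exponential similarity form and bandwidth are illustrative choices, and the CAR model itself is not fitted here.

```python
# Hedged sketch: covariate-similarity neighbourhood weights vs plain
# queen adjacency, for use in a CAR-type smoothing model.
import numpy as np

adjacency = np.array([            # queen-style 0/1 adjacency (hypothetical)
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)
maternal_age = np.array([27.0, 31.5, 30.0, 35.0])  # mean age per area

# Weight = adjacency * exp(-|age_i - age_j| / bandwidth); the bandwidth
# is an illustrative tuning constant.
bandwidth = 3.0
diff = np.abs(maternal_age[:, None] - maternal_age[None, :])
W = adjacency * np.exp(-diff / bandwidth)

# Row-standardise, as is conventional for spatial smoothing weights.
W = W / W.sum(axis=1, keepdims=True)
print(np.round(W, 3))
```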

Relevance:

10.00%

Publisher:

Abstract:

"What was silent will speak, what is closed will open and will take on a voice." (Paul Virilio) The fundamental problem in dealing with the digital is that we are forced to contend with a fundamental deconstruction of form: a deconstruction that renders our content and practice into a single state that can be openly and easily manipulated, reimagined and mashed together in rapid time to create completely unique artefacts and potentially unwranglable jumbles of data. Once our work is essentially broken down into this series of number sequences (or bytes), our sound, images, movies and documents, our memory files, we are left with nothing but choice, and this is the key concern. This absence of form transforms our work into new collections and poses unique challenges for the artist seeking to exploit the potential of digital deconstruction. It is through this struggle with the absent form that we are able to thoroughly explore the latent potential of content, exploit modern abstractions of time and devise approaches within our practice that actively deal with the digital as an essential matter of course.

Relevance:

10.00%

Publisher:

Abstract:

Longitudinal data, in which observations are repeatedly measured over time or age, provide the foundation for the analysis of processes which evolve over time; models of such processes can be referred to as growth or trajectory models. A traditional way of building growth models is to employ linear or polynomial functional forms to model trajectory shape, and to account for variation around an overall mean trend by including random effects, i.e. individual variation, on the functional shape parameters. The identification within these trajectory models of distinct subgroups or sub-classes (latent classes) that are not based on some pre-existing individual classification provides an important methodology with substantive implications; it has wide application in the medical arena, where identifying responders and non-responders on the basis of distinctly differing trajectories delivers further information for clinical processes. This thesis develops Bayesian statistical models and techniques for the identification of subgroups in the analysis of longitudinal data where the number of time intervals is limited. These models are applied to a single case study investigating neuropsychological cognition in early-stage breast cancer patients undergoing adjuvant chemotherapy treatment, from the Cognition in Breast Cancer Study undertaken by the Wesley Research Institute of Brisbane, Queensland. Alternative formulations to the linear or polynomial approach are taken: piecewise linear models with a single turning point, change-point or knot at a known time point, and latent basis models for the non-linear trajectories found for the verbal memory domain of cognitive function before and after chemotherapy treatment. Hierarchical Bayesian random effects models are used as a starting point for the latent class modelling process and are extended by incorporating covariates in the trajectory profiles and as predictors of class membership. The Bayesian latent basis models enable the degree of recovery post-chemotherapy to be estimated for short- and long-term follow-up occasions, and the distinct class trajectories assist in identifying breast cancer patients who may be at risk of long-term verbal memory impairment.
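A minimal sketch of a piecewise linear trajectory with a single known change-point (knot), the kind of functional form used here as an alternative to polynomial growth curves. All numbers and the knot location are illustrative, and the Bayesian hierarchical and latent-class machinery is not reproduced.

```python
# Hedged sketch: mean trajectory with one slope before a known knot and
# another slope after it.
import numpy as np

def piecewise_trajectory(t, intercept, slope_pre, slope_post, knot):
    """Piecewise linear mean: slope_pre before the knot, slope_post after."""
    t = np.asarray(t, dtype=float)
    return (intercept
            + slope_pre * np.minimum(t, knot)
            + slope_post * np.maximum(t - knot, 0.0))

# e.g. a verbal-memory score declining up to the end of chemotherapy
# (the knot) and partially recovering afterwards (numbers illustrative).
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0])
print(piecewise_trajectory(t, intercept=50.0, slope_pre=-4.0,
                           slope_post=2.0, knot=1.0))
```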

Relevance:

10.00%

Publisher:

Abstract:

Home Automation (HA) has emerged as a prominent field for researchers and investors confronting the challenge of penetrating the average home-user market with products and services emerging from a technology-based vision. In spite of many technology contributions, there is a latent demand for affordable and pragmatic assistive technologies for pro-active handling of complex lifestyle-related problems faced by home users. This study pioneered the development of an Initial Technology Roadmap for HA (ITRHA) that formulates a need-based vision of 10-15 years, identifying market, product and technology investment opportunities and focusing on those aspects of HA that contribute to efficient management of home and personal life. The concept of the Family Life Cycle is developed to understand the temporal needs of the family. In order to formally describe a coherent set of family processes, their relationships, and their interaction with external elements, a reference model named the Family System is established that identifies External Entities, 7 major Family Processes, and 7 subsystems: Finance, Meals, Health, Education, Career, Housing, and Socialisation. Analysis of these subsystems reveals Soft, Hard and Hybrid processes. Rectifying the lack of formal methods for eliciting future user requirements and reassessing evolving market needs, this study developed a novel method called Requirement Elicitation of Future Users by Systems Scenario (REFUSS), integrating process modelling and scenario techniques within the framework of roadmapping. REFUSS is used to systematically derive process-automation needs, relating process knowledge to future user characteristics identified from scenarios created to visualise different futures with richly detailed information on lifestyle trends, thus enabling learning about future requirements. Revealing an addressable market size estimated at billions of dollars per annum, this research has developed innovative ideas for software-based products, including Document Management Systems facilitating automated collection and easy retrieval of all documents, an Information Management System automating information services, and a Ubiquitous Intelligent System empowering highly mobile home users with ambient intelligence. Other product ideas include versatile robotic devices, the Kitchen Hand and Cleaner Arm, that can save time. Materialisation of these products requires technology investment, initiating further research in the areas of data extraction and information integration, as well as manipulation and perception, sensor-actuator systems, tactile sensing, odour detection, and robotic controllers. This study recommends new policies on electronic data delivery from service providers as well as new standards for XML-based document structure and format.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the results of a structural equation model (SEM) that describes and quantifies the relationships between corporate culture and safety performance. The SEM is estimated using 196 individual questionnaire responses from three companies with better-than-average safety records. A multiattribute analysis of corporate safety culture characteristics resulted in a hierarchical description of corporate safety culture comprising three major categories: people, process, and value. These three major categories were decomposed into 54 measurable questions, which were used to develop a questionnaire to quantify corporate safety culture. The SEM identified five latent variables that describe corporate safety culture: (1) the company's safety commitment; (2) the safety incentives offered to field personnel for safe performance; (3) subcontractor involvement in the company culture; (4) field safety accountability and dedication; and (5) the disincentives for unsafe behaviors. These characteristics of company safety culture serve as indicators of a company's safety performance. Based on the findings from this limited sample of three companies, the paper proposes a list of practices that companies may consider to improve corporate safety culture and safety performance. A more comprehensive study based on a larger sample is recommended to corroborate the findings of this study.
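A minimal sketch of a measurement-plus-structural model in the spirit of the paper's SEM, under stated assumptions: it uses the semopy package with hypothetical questionnaire columns (q1..q6) standing in for the 54 survey items, shows only two of the five latent variables, and the file name and relationships are illustrative rather than the paper's specification.

```python
# Hedged sketch: a small SEM with two latent factors using semopy.
import pandas as pd
import semopy

# lavaan-style description: "=~" defines measurement of a latent factor,
# "~" defines a structural regression among latent variables.
description = """
safety_commitment =~ q1 + q2 + q3
accountability =~ q4 + q5 + q6
accountability ~ safety_commitment
"""

# One row per respondent, one column per questionnaire item (hypothetical file).
data = pd.read_csv("safety_survey.csv")

model = semopy.Model(description)
model.fit(data)
print(model.inspect())   # estimated loadings and structural paths
```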