899 results for information bottleneck method
Abstract:
At the interim analysis of an adaptive seamless phase II/III clinical trial, data are used for treatment selection, enabling resources to be focused on comparison of the more effective treatment(s) with a control. In this paper, we compare two methods recently proposed to enable the use of short-term endpoint data for decision-making at the interim analysis. The comparison focuses on the power and the probability of correctly identifying the most promising treatment. We show that the choice of method depends on how well short-term data predict the best treatment, which may be measured by the correlation between treatment effects on short- and long-term endpoints.
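As a hedged illustration of the last point, here is a minimal Monte Carlo sketch (not the authors' method; the effect distribution, number of treatments and noise level are all illustrative assumptions) of how the correlation between true treatment effects on the short- and long-term endpoints drives the probability that interim selection on the short-term endpoint picks the treatment that is truly best on the long-term endpoint:

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_correct_selection(rho, n_treat=3, sigma=0.5, n_sim=20_000):
    """P(interim selection on the short-term endpoint picks the treatment
    with the largest true long-term effect)."""
    cov = [[1.0, rho], [rho, 1.0]]
    correct = 0
    for _ in range(n_sim):
        # true (short, long) effects for each treatment, correlation rho
        effects = rng.multivariate_normal([0.0, 0.0], cov, size=n_treat)
        short_true, long_true = effects[:, 0], effects[:, 1]
        # noisy interim estimate of the short-term effect
        short_est = short_true + sigma * rng.standard_normal(n_treat)
        correct += short_est.argmax() == long_true.argmax()
    return correct / n_sim

for rho in (0.0, 0.5, 0.9):
    print(f"rho={rho:.1f}: P(correct selection) ~ {prob_correct_selection(rho):.3f}")
```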
Abstract:
Accurate and reliable rain rate estimates are important for various hydrometeorological applications. Consequently, rain sensors of different types have been deployed in many regions. In this work, measurements from different instruments, namely, rain gauge, weather radar, and microwave link, are combined for the first time to estimate with greater accuracy the spatial distribution and intensity of rainfall. The objective is to retrieve the rain rate that is consistent with all these measurements while incorporating the uncertainty associated with the different sources of information. Assuming the problem is not strongly nonlinear, a variational approach is implemented and the Gauss–Newton method is used to minimize the cost function containing proper error estimates from all sensors. Furthermore, the method can be flexibly adapted to additional data sources. The proposed approach is tested using data from 14 rain gauges and 14 operational microwave links located in the Zürich area (Switzerland) to correct the prior rain rate provided by the operational radar rain product from the Swiss meteorological service (MeteoSwiss). A cross-validation approach demonstrates the improvement of rain rate estimates when assimilating rain gauge and microwave link information.
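For readers unfamiliar with the variational machinery, the following is a minimal sketch, under the simplifying assumption of linear(ised) observation operators, of a Gauss–Newton minimisation of a cost function J(x) = (x - xb)^T B^-1 (x - xb) + sum_k (y_k - H_k(x))^T R_k^-1 (y_k - H_k(x)), where xb is the prior (radar) rain field and each sensor contributes one term. The operators and numbers below are assumptions for illustration, not the MeteoSwiss implementation:

```python
import numpy as np

def gauss_newton_analysis(xb, B, obs_list, n_iter=10):
    """Gauss-Newton minimisation of a variational cost
    J(x) = (x-xb)^T B^-1 (x-xb) + sum_k (y_k - H_k(x))^T R_k^-1 (y_k - H_k(x)).
    obs_list holds (y, R, h, jac) per sensor: observations, error
    covariance, observation operator and its Jacobian."""
    Binv = np.linalg.inv(B)
    x = xb.astype(float).copy()
    for _ in range(n_iter):
        A = Binv.copy()                  # approximate Hessian of J
        g = Binv @ (xb - x)              # background term of the gradient
        for y, R, h, jac in obs_list:
            H = jac(x)
            Rinv = np.linalg.inv(R)
            A += H.T @ Rinv @ H
            g += H.T @ Rinv @ (y - h(x))
        x = x + np.linalg.solve(A, g)    # Gauss-Newton step
    return x

# toy usage: 3 grid cells of rain rate, one gauge observing cell 1
xb = np.array([1.0, 2.0, 0.5])           # prior (e.g. radar) rain field
B = 0.25 * np.eye(3)                     # prior error covariance
Hg = np.array([[0.0, 1.0, 0.0]])         # linear gauge operator
gauge = (np.array([2.6]), np.array([[0.04]]), lambda x: Hg @ x, lambda x: Hg)
print(gauss_newton_analysis(xb, B, [gauge]))
```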
Abstract:
Objectives. While older adults often display memory deficits, with practice they can sometimes selectively remember valuable information at the expense of less valuable information. We examined age-related differences and similarities in memory for health-related information under conditions where some information was critical to remember. Method. In Experiment 1, participants studied three lists of allergens, ranging in severity from 0 (not a health risk) to 10 (potentially fatal), with the instruction that it was particularly important to remember the items to which a fictional relative was most severely allergic. After each list, participants received feedback regarding their recall of the high-value allergens. Experiment 2 examined memory for health benefits, presenting foods that were potentially beneficial to the relative’s immune system. Results. While younger adults exhibited better overall memory for the allergens, both age groups in Experiment 1 developed improved selectivity across the lists, with no evident age differences in severe allergen recall by List 2. Selectivity also developed in Experiment 2, although age differences for items of high health benefit were present. Discussion. The results have implications for models of selective memory in older age, and for how aging influences the ability to strategically remember important information within health-related contexts.
Abstract:
Seamless phase II/III clinical trials in which an experimental treatment is selected at an interim analysis have been the focus of much recent research interest. Many of the methods proposed are based on the group sequential approach. This paper considers designs of this type in which the treatment selection can be based on short-term endpoint information for more patients than have primary endpoint data available. We show that in such a case, the familywise type I error rate may be inflated if previously proposed group sequential methods are used and the treatment selection rule is not specified in advance. A method is proposed to avoid this inflation by considering the treatment selection that maximises the conditional error given the data available at the interim analysis. A simulation study is reported that illustrates the type I error rate inflation and compares the power of the new approach with two other methods: a combination testing approach and a group sequential method that does not use the short-term endpoint data, both of which also strongly control the type I error rate. The new method is also illustrated through application to a study in Alzheimer's disease.
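The combination testing comparator mentioned above is, in its simplest form, the inverse-normal combination of stage-wise p-values. A textbook sketch follows (pre-fixed weights; note that a real seamless design would additionally wrap this in a closed test to adjust for treatment selection, which is omitted here):

```python
from scipy.stats import norm

def inverse_normal_combination(p1, p2, w1=0.5, w2=0.5):
    """Combined p-value from two stage-wise one-sided p-values using
    pre-fixed weights (normalised so the combined z is standard normal
    under H0)."""
    s = (w1**2 + w2**2) ** 0.5
    z = (w1 * norm.isf(p1) + w2 * norm.isf(p2)) / s
    return norm.sf(z)

# reject H0 at one-sided level 0.025 if the combined p-value is below it
print(inverse_normal_combination(0.10, 0.02))   # ~0.009
```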
Abstract:
Key Performance Indicators (KPIs) are the main instruments of Business Performance Management. KPIs are the measures that are translated to both the strategy and the business process. These measures are often designed for an industry sector with assumptions about the business processes in organizations. However, these assumptions can be too incomplete to guarantee the required properties of KPIs. This raises the need to validate the properties of KPIs prior to their application to performance measurement. This paper applies the method called EXecutable Requirements Engineering Management and Evolution (EXTREME) for validation of KPI definitions. EXTREME semantically relates the goal modeling, conceptual modeling and protocol modeling techniques into one methodology. The synchronous composition built into protocol modeling enables traceability of goals in protocol models and constructive definitions of a KPI. The application of the method clarifies the meaning of KPI properties and the procedures for their assessment and validation.
Abstract:
Image registration is a fundamental step that greatly affects later processes in image mosaicking, multi-spectral image fusion, digital surface modelling, etc., where the final solution requires blending pixel information from more than one image. It is highly desirable to find a way to identify registration regions among input stereo image pairs with high accuracy, particularly in remote sensing applications in which ground control points (GCPs) are not always available, such as selecting a landing zone on an outer-space planet. In this paper, a framework for localization in image registration is developed. It strengthens local registration accuracy in two respects: lower reprojection error and better feature point distribution. The affine scale-invariant feature transform (ASIFT) was used to acquire feature points and correspondences on the input images. Then, a homography matrix was estimated as the transformation model by an improved random sample consensus (IM-RANSAC) algorithm. In order to identify a registration region with a better spatial distribution of feature points, the Euclidean distance between feature points is applied (named the S criterion). Finally, the parameters of the homography matrix were optimized by the Levenberg–Marquardt (LM) algorithm with selected feature points from the chosen registration region. In the experimental section, Chang’E-2 satellite remote sensing imagery was used to evaluate the performance of the proposed method. The results demonstrate that the proposed method can automatically locate a specific region with high registration accuracy between input images, achieving lower root mean square error (RMSE) and better distribution of feature points.
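A rough, hedged sketch of the overall pipeline in Python/OpenCV is given below. SIFT stands in for ASIFT (ASIFT is not in core OpenCV), cv2.findHomography's built-in RANSAC stands in for the paper's IM-RANSAC, and the S-criterion region selection is not shown; the threshold is illustrative:

```python
import cv2
import numpy as np

def register_pair(img1, img2, ransac_thresh=3.0):
    """Estimate a homography between two grayscale images and report the
    inlier reprojection RMSE used to score candidate registration regions."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img1, None)
    k2, d2 = sift.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, ransac_thresh)
    inl = mask.ravel() == 1
    proj = cv2.perspectiveTransform(src[inl], H)   # reproject inliers
    rmse = float(np.sqrt(np.mean(np.sum((proj - dst[inl]) ** 2, axis=2))))
    return H, rmse
```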
Abstract:
During the last few years Enterprise Architecture has received increasing attention among industry and academia. Enterprise Architecture (EA) can be defined as (i) a formal description of the current and future state(s) of an organisation, and (ii) a managed change between these states to meet the organisation’s stakeholders’ goals and to create value for the organisation. By adopting EA, organisations may gain a number of benefits such as better decision making, increased revenues and cost reductions, and alignment of business and IT. To increase the performance of public sector operations, and to improve public services and their availability, the Finnish Parliament ratified the Act on Information Management Governance in Public Administration in 2011. The Act mandates public sector organisations to start adopting EA by 2014, including Higher Education Institutions (HEIs). Despite the benefits of EA and the Act, the EA adoption level and maturity in Finnish HEIs are low. This is partly because EA adoption has been found to be difficult. Thus there is a need for a solution to help organisations adopt EA successfully. This thesis follows the Design Science (DS) approach to improve the traditional EA adoption method in order to increase the likelihood of successful adoption. First, a model is developed to explain change resistance during EA adoption. To identify problems associated with EA adoption, an EA pilot conducted in 2010 among 12 Finnish HEIs was analysed using the model. It was found that most of the problems were caused by misunderstood EA concepts, attitudes, and lack of skills. The traditional EA adoption method does not pay attention to these. To overcome the limitations of the traditional EA adoption method, an improved EA Adoption Method (EAAM) is introduced. By following EAAM, organisations may increase the likelihood of successful EA adoption. EAAM helps in acquiring the mandate for EA adoption from top management, which has been found to be crucial to success. It also helps in supporting individual and organisational learning, which has also been found to be essential to successful adoption.
Abstract:
Background Access to, and the use of, information and communication technology (ICT) is increasingly becoming a vital component of mainstream life. First-order (e.g. time and money) and second-order factors (e.g. beliefs of staff members) affect the use of ICT in different contexts. It is timely to investigate what these factors may be in the context of service provision for adults with intellectual disabilities, given the role ICT could play in facilitating communication and access to information and opportunities, as suggested in Valuing People. Method Taking a qualitative approach, nine day-service sites within one organization were visited over a period of 6 months to observe ICT-related practice and seek the views of staff members working with adults with intellectual disabilities. All the day services were equipped with modern ICT equipment including computers, digital cameras, Internet connections and related peripherals. Results Staff members reported time, training and budget as significant first-order factors. Organizational culture and beliefs about the suitability of technology for older or less able service users were the striking second-order factors mentioned. Despite similar levels of equipment, support and training, ICT use had developed in very different ways across sites. Conclusion The provision of ICT equipment and training is not sufficient to ensure their use; the beliefs of staff members and the organizational culture of sites play a substantial role in how ICT is used with and by service users. Activity theory provides a useful framework for considering how first- and second-order factors are related. Staff members need to be given clear information about the broader purpose of activities in day services, especially in relation to the lifelong learning agenda, in order to see the relevance and usefulness of ICT resources for all service users.
Abstract:
Noncompetitive bids have recently become a major concern in both public and private sector construction contract auctions. Consequently, several models have been developed to help identify bidders potentially involved in collusive practices. However, most of these models require complex calculations and extensive information that is difficult to obtain. The aim of this paper is to utilize recent developments for detecting abnormal bids in capped auctions (auctions with an upper bid limit set by the auctioneer) and extend them to the more conventional uncapped auctions (where no such limits are set). To accomplish this, a new method is developed for estimating the values of bid distribution supports by using the solution to what has become known as the German Tank problem. The model is then demonstrated and tested on a sample of real construction bid data, and shown to detect cover bids with high accuracy. This paper contributes to an improved understanding of abnormal bid behavior as an aid to detecting and monitoring potential collusive bid practices.
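For the continuous uniform case, the order-statistic extension of the German Tank estimator gives unbiased estimates of the support endpoints. A minimal sketch follows (the estimator only; the surrounding cover-bid screening logic and the bid figures are illustrative assumptions):

```python
import numpy as np

def uniform_support_umvu(bids):
    """UMVU estimators of the endpoints [a, b] of a continuous uniform
    distribution, the order-statistic analogue of the German Tank
    estimator: each endpoint is pushed out by the mean inter-bid gap."""
    x = np.sort(np.asarray(bids, dtype=float))
    k = len(x)
    gap = (x[-1] - x[0]) / (k - 1)
    return x[0] - gap, x[-1] + gap

bids = [412_000, 405_500, 398_200, 420_900, 409_300]   # illustrative bids
a_hat, b_hat = uniform_support_umvu(bids)
print(f"estimated bid support: [{a_hat:.0f}, {b_hat:.0f}]")
# a bid lying far outside the support estimated from the other bids can
# then be flagged as a potential cover bid
```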
Abstract:
In numerical weather prediction, parameterisations are used to simulate missing physics in the model. These can be due to a lack of scientific understanding or a lack of computing power available to address all the known physical processes. Parameterisations are sources of large uncertainty in a model, as the parameter values used in them cannot be measured directly and hence are often not well known, and the parameterisations themselves are also approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation (DA), such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential DA methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to pre-determined functional forms of missing physics or parameterisations that are based upon prior information. The method was applied to a one-dimensional advection model with additive model error, and it is shown that the method can accurately estimate parameterisations, with consistent error estimates. Furthermore, it is shown how the method depends on the quality of the DA results. The results indicate that this new method is a powerful tool for systematic model improvement.
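The final fitting step can be illustrated with a short least-squares sketch: given model-error estimates eta at grid points (as produced by the sequential DA step, which is not shown) and a set of candidate functional forms, the coefficients follow from an ordinary linear fit. The basis functions and toy data below are assumptions for illustration:

```python
import numpy as np

def fit_parameterisation(eta, basis, grid):
    """Ordinary least-squares fit of DA-estimated model errors eta to a
    set of candidate functional forms evaluated on the model grid."""
    Phi = np.column_stack([phi(grid) for phi in basis])   # design matrix
    coef, *_ = np.linalg.lstsq(Phi, eta, rcond=None)
    return coef

# toy usage on a 1-D periodic grid: errors that are really 0.2*sin(x)
x = np.linspace(0.0, 2.0 * np.pi, 50)
rng = np.random.default_rng(1)
eta = 0.2 * np.sin(x) + 0.01 * rng.standard_normal(x.size)
print(fit_parameterisation(eta, [np.sin, np.cos, lambda s: s], x))
# -> coefficients close to [0.2, 0.0, 0.0]
```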
Abstract:
Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we will call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this information is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
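A minimal numpy sketch of the core construction (not the authors' code): the cube is flattened to a zone-by-wavelength matrix, mean-subtracted, and decomposed by SVD, so that rows of the right singular matrix are the eigenvectors (eigenspectra) and the reshaped projections are the tomograms:

```python
import numpy as np

def pca_tomography(cube):
    """PCA of a (ny, nx, nl) data cube: returns tomograms (one image per
    principal component), eigenvectors (eigenspectra, rows of vt) and the
    fraction of variance each component explains."""
    ny, nx, nl = cube.shape
    data = cube.reshape(ny * nx, nl).astype(float)
    data -= data.mean(axis=0)                        # centre each wavelength
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    tomograms = (data @ vt.T).T.reshape(-1, ny, nx)  # component k -> image k
    return tomograms, vt, s**2 / np.sum(s**2)
```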
Abstract:
The South American fur seal, Arctocephalus australis, was one of the earliest otariid seals to be exploited by humans: at least 6000 years ago on the Atlantic coast and 4000 years ago on the Pacific coast of South America. More than 750,000 fur seals were killed in Uruguay until 1991. However, a climatological phenomenon, the severe 1997-1998 El Niño Southern Oscillation (ENSO), was responsible for the decline of 72% of the Peruvian fur seal population due to starvation as a consequence of the warming of sea-surface temperatures and reduced primary productivity. Currently, there is no precise information on global population size or on the species' conservation status. The present study includes the first bottleneck test for the Pacific and Atlantic populations of A. australis, based on the analysis of seven microsatellite loci. A genetic bottleneck compromises the evolutionary potential of a population to respond to environmental changes. The perspective becomes even more alarming given current global warming models that predict stronger and more frequent ENSO events in the future. Our analysis found moderate support for deviation from neutrality-equilibrium for the Pacific population of fur seals and none for the Atlantic population. This difference between the populations reflects different demographic histories, and is consistent with a greater reduction in population size in the Pacific. Such an event could be a result of the synergistic effects of recurrent ENSO events and anthropogenic impacts (sealing and prey overfishing) on this population.
Abstract:
NMR quantum information processing studies rely on the reconstruction of the density matrix representing so-called pseudo-pure states (PPS). The initially pure part of a PPS undergoes unitary and non-unitary (relaxation) transformations during a computation process, causing a "loss of purity" until equilibrium is reached. Moreover, upon relaxation, the nuclear polarization varies in time, a fact which must be taken into account when comparing density matrices at different instants. Attempting to use time-fixed normalization procedures when relaxation is present leads to various anomalies in the matrix populations. In this paper we propose a method which takes into account the time dependence of the normalization factor. From a generic form for the deviation density matrix, an expression for the relaxing initial pure state is deduced. The method is exemplified with an experiment on the relaxation of the concurrence of a pseudo-entangled state, which exhibits the phenomenon of sudden death, and the relaxation of the Wigner function of a pseudo-cat state.
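The entanglement measure tracked in the concurrence experiment is standard: Wootters' concurrence of a two-qubit density matrix. A textbook sketch (not the authors' normalisation procedure) is:

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy              # spin-flipped state
    ev = np.linalg.eigvals(rho @ rho_tilde).real  # theory: real, >= 0
    lam = np.sort(np.sqrt(np.clip(ev, 0.0, None)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# check: the Bell state (|00> + |11>)/sqrt(2) has concurrence 1
bell = np.zeros((4, 4))
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
print(concurrence(bell))   # -> 1.0
```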
Abstract:
Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed to BACE-1, as well as five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial charges calculation method was also analyzed. Several consistent and predictive models are reported, including one with r² = 0.88, q² = 0.69 and r²pred = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. The visual analyses of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
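The external predictive power statistic r²pred reported above is conventionally defined as 1 - PRESS/SD, with PRESS the sum of squared test-set prediction errors and SD the sum of squared deviations of the test-set activities from the training-set mean. A minimal sketch with made-up activity values:

```python
import numpy as np

def r2_pred(y_train, y_test, y_test_pred):
    """External predictive r2: 1 - PRESS/SD, with SD taken about the
    training-set mean activity (a common 3D-QSAR convention)."""
    y_test, y_test_pred = np.asarray(y_test), np.asarray(y_test_pred)
    press = np.sum((y_test - y_test_pred) ** 2)
    sd = np.sum((y_test - np.mean(y_train)) ** 2)
    return 1.0 - press / sd

# illustrative pIC50 values only
print(r2_pred([6.1, 7.3, 5.8, 6.6], [6.9, 7.5], [6.5, 7.2]))
```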
Abstract:
This presentation was offered as part of the CUNY Library Assessment Conference, Reinventing Libraries: Reinventing Assessment, held at the City University of New York in June 2014.