976 results for least common subgraph algorithm


Relevance: 20.00%

Publisher:

Abstract:

In the recent decision of Hunter and New England Local Health District v McKenna; Hunter and New England Local Health District v Simon, the High Court of Australia held that a hospital and its medical staff owed no common law duty of care to third parties claiming for mental harm, against the background of statutory powers to detain mentally ill patients. This conclusion was based in part on the statutory framework and in part on the inconsistency that would arise if such a duty were imposed. Were such a duty imposed in these circumstances, doctors might generally detain rather than discharge mentally ill persons in order to avoid the foreseeable risk of harm to others. Such an approach would be inconsistent with the policy of the mental health legislation, which favours personal liberty and discharge rather than detention unless no other care of a less restrictive kind is appropriate and reasonably available.

Relevance: 20.00%

Publisher:

Abstract:

A method for reconstruction of an object f(x), x = (x, y, z), from a limited set of cone-beam projection data has been developed. This method uses a modified form of convolution back-projection together with projection onto convex sets (POCS) to handle the limited (or incomplete) data problem. In cone-beam tomography, a complete scanning geometry is needed to reconstruct the original three-dimensional object exactly. While complete geometries do exist, they are of little use in practical implementations. The most common trajectory used in practical scanners is circular, which is incomplete. It is, however, possible to recover some of the information of the original signal f(x) based on a priori knowledge of the nature of f(x). If this knowledge can be posed in a convex set framework, then POCS can be utilized. In this report, we use this a priori knowledge as convex set constraints to reconstruct f(x) using POCS. While we demonstrate the effectiveness of our algorithm for circular trajectories, it is essentially geometry independent and will be useful in any limited-view cone-beam reconstruction.
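
For readers unfamiliar with POCS, the sketch below illustrates the basic iteration under simple assumptions: the convex sets are non-negativity, a known amplitude bound and a known support region, and the projector names are ours. The data-consistency projection onto the measured cone-beam projections, which a full reconstruction would also apply, is omitted here.

    # A minimal POCS sketch (not the paper's implementation): starting from an
    # initial estimate, cyclically project onto convex constraint sets that
    # encode a priori knowledge of f(x).
    import numpy as np

    def project_nonnegative(f):
        # C1: the object is non-negative
        return np.maximum(f, 0.0)

    def project_amplitude(f, fmax):
        # C2: values are bounded above by a known maximum
        return np.minimum(f, fmax)

    def project_support(f, support_mask):
        # C3: the object vanishes outside a known support region
        return f * support_mask

    def pocs(f0, projectors, n_iter=50):
        # Cyclically apply each convex-set projection
        f = f0.copy()
        for _ in range(n_iter):
            for P in projectors:
                f = P(f)
        return f

    # Toy usage on a 32^3 volume (a data-consistency projection onto the
    # measured cone-beam data would be a fourth projector in practice).
    f0 = np.random.rand(32, 32, 32)
    mask = np.zeros_like(f0)
    mask[4:28, 4:28, 4:28] = 1.0
    f_rec = pocs(f0, [lambda f: project_support(f, mask),
                      project_nonnegative,
                      lambda f: project_amplitude(f, 1.0)],
                 n_iter=20)

Because each projector maps onto a convex set, cyclically applying them keeps the estimate consistent with all of the imposed constraints, which is what makes the incorporation of a priori knowledge well behaved.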

Relevance: 20.00%

Publisher:

Abstract:

A defect-selective photothermal imaging system for the diagnostics of optical coatings is demonstrated. The instrument has been optimized with respect to pump and probe parameters, detector performance, and the signal processing algorithm. The imager is capable of efficiently mapping purely optical or thermal defects in coatings of low damage threshold and low absorbance. Detailed mapping of minor inhomogeneities at low pump power has been achieved through the simultaneous action of a low-noise fiber optic photothermal beam deflection sensor and a common-mode-rejection demodulation (CMRD) technique. The linearity and sensitivity of the sensor have been examined theoretically and experimentally, and the signal-to-noise ratio improvement factor is found to be about 110 compared to a conventional bicell photodiode. The scanner is designed so that mapping of static or shock-sensitive samples is possible. In the case of a sample with an absolute absorptance of 3.8 x 10^-4, a change in absorptance of about 0.005 x 10^-4 has been detected without ambiguity, giving a contrast parameter of 760. This is about a 1085% improvement over the conventional approach based on a bicell photodiode, at the same pump power. The merits of the system have been demonstrated by mapping two intentionally created damage sites in an MgF2 coating on fused silica at different excitation powers. Amplitude and phase maps were recorded for thermally thin and thick cases, and the results are compared to demonstrate a case which, in conventional imaging, would lead to a deceptive conclusion regarding the type and location of the damage. A residual damage profile created by long-term irradiation at high pump power density has also been depicted.

Relevance: 20.00%

Publisher:

Abstract:

Recognizing similarities and deriving relationships among protein molecules is a fundamental requirement in present-day biology. Similarities can be present at various levels, which can be detected through comparison of protein sequences or their structural folds. In some cases, similarities that are obscured at these levels may be present merely in the substructures at their binding sites. Inferring functional similarities between protein molecules by comparing their binding sites is still largely exploratory and not yet a routine protocol. One of the main reasons for this is the limited choice of appropriate analytical tools that can compare binding sites with high sensitivity. To benefit from the enormous amount of structural data that is being rapidly accumulated, it is essential to have high-throughput tools that enable large-scale binding site comparison. Results: Here we present a new algorithm, PocketMatch, for comparison of binding sites in a frame-invariant manner. Each binding site is represented by 90 lists of sorted distances capturing the shape and chemical nature of the site. The sorted arrays are then aligned using an incremental alignment method and scored to obtain PMScores for pairs of sites. A comprehensive sensitivity analysis and an extensive validation of the algorithm have been carried out. A comparison with other site-matching algorithms is also presented. Perturbation studies, in which the geometry of a given site was retained but the residue types were changed randomly, indicated that chance similarities were virtually non-existent. Our analysis also demonstrates that shape information alone is insufficient to discriminate between diverse binding sites unless combined with the chemical nature of the amino acids. Conclusion: A new algorithm has been developed to compare binding sites in an accurate, efficient and high-throughput manner. Though the representation used is conceptually simplistic, we demonstrate that, together with the new alignment strategy, it is sufficient to enable binding-site comparison with high sensitivity. A novel methodology has also been presented for validating the algorithm for accuracy and sensitivity with respect to the geometry and chemical nature of the site. The method is also fast, taking about 1/250th of a second for one comparison on a single processor. A parallel version on BlueGene has also been implemented.
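
A stripped-down sketch of the underlying idea follows; it is not the PocketMatch implementation. It collapses the 90 chemically typed lists into a single sorted all-pairs distance list per site and uses an illustrative tolerance, but it shows the frame-invariant representation, the incremental (two-pointer) alignment of sorted arrays, and a normalised score in the spirit of the PMScore.

    # Simplified binding-site comparison via sorted distance lists.
    import numpy as np

    def site_signature(coords):
        # Sorted list of all pairwise distances between binding-site points:
        # invariant to rotation and translation of the site.
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        iu = np.triu_indices(len(coords), k=1)
        return np.sort(d[iu])

    def align_sorted(a, b, tol=0.5):
        # Incremental (two-pointer) alignment of two sorted distance lists:
        # count elements that can be matched within the tolerance.
        i = j = matches = 0
        while i < len(a) and j < len(b):
            if abs(a[i] - b[j]) <= tol:
                matches += 1
                i += 1
                j += 1
            elif a[i] < b[j]:
                i += 1
            else:
                j += 1
        return matches

    def pm_like_score(site_a, site_b, tol=0.5):
        a, b = site_signature(site_a), site_signature(site_b)
        # Normalise by the larger list so identical sites score 1.0
        return align_sorted(a, b, tol) / max(len(a), len(b))

    # Toy usage: a site compared with a slightly perturbed copy of itself
    rng = np.random.default_rng(0)
    site = rng.uniform(0, 10, size=(12, 3))
    print(pm_like_score(site, site + rng.normal(0, 0.1, site.shape)))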

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a new approach for assessing power system voltage stability based on an artificial feed-forward neural network (FFNN). The approach uses the real and reactive power, as well as the voltage vectors, of generators and load buses to train the neural net (NN). The input features of the NN are generated from offline training data for various simulated loading conditions using a conventional voltage stability algorithm based on the L-index. The performance of the trained NN is investigated on two systems under various voltage stability assessment conditions. The main advantage of the proposed approach is that it is fast, robust and accurate, and can be used online to predict the L-indices of all the power system buses simultaneously. The method can also be used effectively to determine local and global stability margins for further improvement measures.
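
The sketch below shows the general shape of such a scheme, not the paper's network or data: bus-wise P, Q and V features mapped to per-bus L-indices by a small feed-forward network. The feature layout, network size and placeholder random data are our assumptions; in practice the targets come from the conventional L-index computation over simulated loading conditions.

    # A minimal feed-forward-regression sketch for L-index prediction.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    n_buses, n_samples = 10, 500
    rng = np.random.default_rng(1)

    # Features: real power, reactive power and voltage magnitude at each bus
    X = rng.uniform(size=(n_samples, 3 * n_buses))
    # Targets: L-index of every bus (placeholder values in [0, 1]; in practice
    # these come from the conventional L-index algorithm)
    y = rng.uniform(size=(n_samples, n_buses))

    net = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000, random_state=1)
    net.fit(X, y)

    # Online use: predict the L-indices of all buses for a new operating point
    l_pred = net.predict(X[:1])

Once trained offline, a single forward pass gives the L-indices of all buses at once, which is what makes the approach attractive for online use.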

Relevance: 20.00%

Publisher:

Abstract:

Phenotypic convergence is thought to be driven by parallel substitutions coupled with natural selection at the sequence level. Multiple independent evolutionary transitions of mammals to an aquatic environment offer an opportunity to test this thesis. Here, whole genome alignment of coding sequences identified widespread parallel amino acid substitutions in marine mammals; however, the majority of these changes were not unique to these animals. Conversely, we report that candidate aquatic adaptation genes, identified by signatures of likelihood convergence and/or elevated ratio of nonsynonymous to synonymous nucleotide substitution rate, are characterized by very few parallel substitutions and exhibit distinct sequence changes in each group. Moreover, no significant positive correlation was found between likelihood convergence and positive selection in all three marine lineages. These results suggest that convergence in protein coding genes associated with aquatic lifestyle is mainly characterized by independent substitutions and relaxed negative selection.

Relevance: 20.00%

Publisher:

Abstract:

Algorithms for planning quasistatic attitude maneuvers based on the Jacobian of the forward kinematic mapping of fully-reversed (FR) sequences of rotations are proposed in this paper. An FR sequence of rotations is a series of finite rotations that consists of initial rotations about the axes of a body-fixed coordinate frame and subsequent rotations that undo these initial rotations. Unlike the Jacobian of conventional systems such as a robot manipulator, the Jacobian of the system manipulated through FR rotations is a null matrix at the identity, which leads to a total breakdown of the traditional Jacobian formulation. Therefore, the Jacobian algorithm is reformulated and implemented so as to synthesize an FR sequence for a desired rotational displacement. The Jacobian-based algorithm presented in this paper identifies particular six-rotation FR sequences that synthesize desired orientations. We developed the single-step and the multiple-step Jacobian methods to accomplish a given task using six-rotation FR sequences. The single-step Jacobian method identifies a specific FR sequence for a given desired orientation and the multiple-step Jacobian algorithm synthesizes physically feasible FR rotations on an optimal path. A comparison with existing algorithms verifies the fast convergence ability of the Jacobian-based algorithm. Unlike closed-form solutions to the inverse kinematics problem, the Jacobian-based algorithm determines the most efficient FR sequence that yields a desired rotational displacement through a simple and inexpensive numerical calculation. The procedure presented here is useful for those motion planning problems wherein the Jacobian is singular or null.
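
As a rough illustration of a Jacobian-based iteration (not the paper's algorithm), the sketch below numerically differentiates a forward kinematic map and applies pseudoinverse updates until the desired orientation is reached; an ordinary body-fixed XYZ rotation sequence stands in for the FR forward kinematics, whose specific six-rotation structure we do not reproduce here.

    # Generic Jacobian-based inverse kinematics on orientations.
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def forward(theta):
        # Stand-in forward kinematic map: angles -> rotation
        return R.from_euler("XYZ", theta)

    def error(theta, target):
        # Rotation-vector error between desired and current orientation
        return (target * forward(theta).inv()).as_rotvec()

    def solve(target, theta0, tol=1e-10, max_iter=100, h=1e-6):
        theta = np.asarray(theta0, float)
        for _ in range(max_iter):
            e = error(theta, target)
            if np.linalg.norm(e) < tol:
                break
            # Finite-difference Jacobian of the error w.r.t. the angles
            J = np.column_stack([
                (error(theta + h * np.eye(3)[k], target) - e) / h
                for k in range(3)])
            # Newton-type update using the pseudoinverse (tolerates rank loss)
            theta = theta - np.linalg.pinv(J) @ e
        return theta

    target = R.from_euler("XYZ", [0.4, -0.2, 0.7])
    print(solve(target, theta0=[0.1, 0.1, 0.1]))

The pseudoinverse is used because the Jacobian can lose rank, where a plain matrix inverse would fail; the fully degenerate case at the identity, where the FR Jacobian is null, is exactly what the paper's reformulation addresses.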

Relevance: 20.00%

Publisher:

Abstract:

This work deals with the formulation and implementation of an energy-momentum conserving algorithm for the nonlinear transient analysis of structures, within the framework of stress-based hybrid elements. Hybrid elements, which are based on a two-field variational formulation, are much less susceptible to locking than conventional displacement-based elements within the static framework. We show that this advantage carries over to the transient case, so that not only are the solutions more accurate, but they are also obtained in fewer iterations. We demonstrate the efficacy of the algorithm on a wide range of problems, such as those involving dynamic buckling and complicated three-dimensional motions, among others.
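
To make the idea of an energy-momentum conserving scheme concrete, here is a one-degree-of-freedom sketch (not the hybrid-element formulation) using a discrete-gradient evaluation of the internal force; with this choice the discrete total energy is preserved exactly, up to the tolerance of the nonlinear solver. The Duffing-type potential and step size are illustrative.

    # Discrete-gradient (energy-conserving) time integration of m*x'' = -V'(x).
    import numpy as np
    from scipy.optimize import fsolve

    m = 1.0
    V = lambda x: 0.25 * x**4 + 0.5 * x**2       # Duffing-type potential
    dV = lambda x: x**3 + x                      # its exact gradient

    def discrete_gradient(x0, x1):
        # Exact increment of V per unit displacement; falls back to the
        # midpoint gradient when x1 is (numerically) equal to x0.
        if abs(x1 - x0) < 1e-12:
            return dV(0.5 * (x0 + x1))
        return (V(x1) - V(x0)) / (x1 - x0)

    def step(x0, v0, dt):
        def residual(z):
            x1, v1 = z
            return [x1 - x0 - 0.5 * dt * (v0 + v1),
                    m * (v1 - v0) + dt * discrete_gradient(x0, x1)]
        return fsolve(residual, [x0 + dt * v0, v0])

    # Integrate and verify that the total energy stays constant step after step
    x, v, dt = 1.5, 0.0, 0.05
    E0 = 0.5 * m * v**2 + V(x)
    for _ in range(2000):
        x, v = step(x, v, dt)
    print(E0, 0.5 * m * v**2 + V(x))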

Relevance: 20.00%

Publisher:

Abstract:

A simple sequential thinning algorithm for peeling off pixels along contours is described. An adaptive algorithm, obtained by incorporating shape adaptivity into this sequential process, is also given. The distortions in the skeleton at right-angle and acute-angle corners are minimized in the adaptive algorithm. The asymmetry of the skeleton, a characteristic of sequential algorithms caused by the presence of T-corners in some even-thickness patterns, is eliminated. The performance (in terms of time requirements and shape preservation) is compared with that of a modern thinning algorithm.
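
For context, the sketch below implements a standard two-subiteration thinning pass (Zhang-Suen style) rather than the sequential and adaptive algorithms described above; it illustrates the basic operation of peeling contour pixels while preserving connectivity.

    # Standard Zhang-Suen thinning on a binary image (0/1 values).
    import numpy as np

    def _neighbours(img, r, c):
        # p2..p9, clockwise from the pixel directly above (r-1, c)
        return [img[r-1, c], img[r-1, c+1], img[r, c+1], img[r+1, c+1],
                img[r+1, c], img[r+1, c-1], img[r, c-1], img[r-1, c-1]]

    def zhang_suen(img):
        # img is assumed to be padded with a border of zeros
        img = img.copy()
        changed = True
        while changed:
            changed = False
            for step in (0, 1):
                to_delete = []
                for r in range(1, img.shape[0] - 1):
                    for c in range(1, img.shape[1] - 1):
                        if img[r, c] != 1:
                            continue
                        p = _neighbours(img, r, c)
                        B = sum(p)                       # nonzero neighbours
                        A = sum((p[k] == 0 and p[(k + 1) % 8] == 1)
                                for k in range(8))       # 0 -> 1 transitions
                        if step == 0:
                            cond = (p[0] * p[2] * p[4] == 0 and
                                    p[2] * p[4] * p[6] == 0)
                        else:
                            cond = (p[0] * p[2] * p[6] == 0 and
                                    p[0] * p[4] * p[6] == 0)
                        if 2 <= B <= 6 and A == 1 and cond:
                            to_delete.append((r, c))
                if to_delete:
                    changed = True
                    for r, c in to_delete:
                        img[r, c] = 0
        return img

    # Toy usage: thin a filled 8 x 12 rectangle down to a line
    pattern = np.zeros((10, 14), int)
    pattern[1:9, 1:13] = 1
    print(zhang_suen(pattern))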

Relevance: 20.00%

Publisher:

Abstract:

We propose an iterative estimating equations procedure for analysis of longitudinal data. We show that, under very mild conditions, the probability that the procedure converges at an exponential rate tends to one as the sample size increases to infinity. Furthermore, we show that the limiting estimator is consistent and asymptotically efficient, as expected. The method applies to semiparametric regression models with unspecified covariances among the observations. In the special case of linear models, the procedure reduces to iterative reweighted least squares. Finite sample performance of the procedure is studied by simulations, and compared with other methods. A numerical example from a medical study is considered to illustrate the application of the method.
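
As a point of reference for the linear-model special case mentioned above, here is a minimal iteratively reweighted least squares sketch for independent heteroscedastic errors; the log-linear variance model and the simulated data are our assumptions, and the paper's procedure additionally handles unspecified covariances among repeated observations.

    # Iteratively reweighted least squares for a heteroscedastic linear model.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 400
    x = rng.uniform(0, 2, n)
    X = np.column_stack([np.ones(n), x])
    # Simulated data: the error standard deviation grows with x
    sigma = np.exp(0.3 + 0.8 * x)
    y = X @ np.array([1.0, 2.0]) + sigma * rng.standard_normal(n)

    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
    for _ in range(10):
        resid = y - X @ beta
        # Model the log variance as linear in x and refit the weights
        gamma = np.linalg.lstsq(X, np.log(resid**2 + 1e-12), rcond=None)[0]
        w = np.exp(-X @ gamma)                           # w_i = 1 / var_hat_i
        # Weighted least-squares update: solve X' W X beta = X' W y
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    print(beta)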

Relevance: 20.00%

Publisher:

Abstract:

Background: Epidemiological and clinical studies suggest comorbidity between prostate cancer (PCA) and cardiovascular disease (CVD) risk factors. However, the relationship between these two phenotypes is still not well understood. Here we sought to identify shared genetic loci between PCA and CVD risk factors. Methods: We applied a genetic epidemiology method based on the conjunction false discovery rate (FDR) that combines summary statistics from different genome-wide association studies (GWAS) and allows identification of genetic overlap between two phenotypes. We evaluated summary statistics from large, multi-centre GWA studies of PCA (n = 50 000) and CVD risk factors (n = 200 000) [triglycerides (TG), low-density lipoprotein (LDL) cholesterol and high-density lipoprotein (HDL) cholesterol, systolic blood pressure, body mass index, waist-hip ratio and type 2 diabetes (T2D)]. Enrichment of single nucleotide polymorphisms (SNPs) associated with PCA and CVD risk factors was assessed with conditional quantile-quantile plots and the Anderson-Darling test. Moreover, we pinpointed shared loci using the conjunction FDR. Results: The strongest enrichment of P-values in PCA was found conditional on LDL and on TG. In contrast, we found only weak enrichment conditional on HDL or on the other traits investigated. Conjunction FDR identified altogether 17 loci: 10 loci were associated with PCA and LDL, 3 loci were associated with PCA and TG, and an additional 4 loci were associated with PCA, LDL and TG jointly (conjunction FDR < 0.01). For T2D, we detected one locus adjacent to HNF1B. Conclusions: We found polygenic overlap between PCA predisposition and blood lipids, in particular LDL and TG, and identified 17 pleiotropic gene loci between PCA and LDL, and PCA and TG, respectively. These findings provide novel pathobiological insights and may have implications for trials using lipid-lowering agents in a prevention or cancer setting.
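
The sketch below gives a simplified empirical estimator in the spirit of the conditional and conjunction FDR used here; it ignores linkage disequilibrium and the other refinements of the published method, and the simulated p-values are purely illustrative.

    # Simplified conditional / conjunction FDR from two sets of GWAS p-values.
    import numpy as np

    def cfdr(p1, p2):
        # Empirical conditional FDR of trait 1 given trait 2 for every SNP:
        # cFDR_i ~ p1_i * #{p2 <= p2_i} / #{p1 <= p1_i and p2 <= p2_i}
        out = np.empty_like(p1)
        for i in range(len(p1)):
            cond = p2 <= p2[i]
            both = cond & (p1 <= p1[i])
            out[i] = p1[i] * cond.sum() / max(both.sum(), 1)
        return np.minimum(out, 1.0)

    def conjunction_fdr(p1, p2):
        # Conservative conjunction FDR: maximum of the two conditional FDRs
        return np.maximum(cfdr(p1, p2), cfdr(p2, p1))

    # Toy usage with simulated summary statistics: most SNPs null, a small
    # block strongly associated with both traits.
    rng = np.random.default_rng(3)
    n = 5000
    p_a, p_b = rng.uniform(size=n), rng.uniform(size=n)
    p_a[:50] = rng.uniform(0, 1e-6, 50)
    p_b[:50] = rng.uniform(0, 1e-6, 50)
    hits = np.where(conjunction_fdr(p_a, p_b) < 0.01)[0]
    print(len(hits))

Taking the maximum of the two conditional FDRs is what makes the conjunction statistic conservative: a locus is reported only if it is unlikely to be null for either phenotype given its association with the other.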

Relevance: 20.00%

Publisher:

Abstract:

Pancreatic exocrine dysfunction has been frequently recorded in protein-energy malnutrition in underdeveloped countries. In addition, the pancreas requires optimal nutrition for enzyme synthesis, and potentially correctable pancreatic enzyme insufficiency may play a role in the continuation of protein-energy malnutrition. This problem has not previously been evaluated in Australian Aborigines. We have applied a screening test for pancreatic dysfunction (human immunoreactive trypsinogen [IRT] assay) to the study of 398 infants (6-36 months) admitted to the Alice Springs Hospital over a 20-month period. All infants were assessed by anthropometric measures and were assigned to three nutritional groups (normal, moderately or severely malnourished) and two growth groups (stunted or not stunted). Of the 198 infants who had at least a single serum cationic trypsinogen measurement taken, normal values for serum IRT (with confidence limits) were obtained from 57 children who were normally nourished. IRT levels were significantly correlated with the degree of underweight, but there was no correlation with the degree of stunting or age. Mean IRT levels for the moderately and severely underweight groups were significantly greater than the mean for the normal group (P < 0.01). Seventeen children (8.6%) had trypsinogen levels in excess of the 95th percentile for the normally nourished group, reflecting acinar cell damage or ductal obstruction. We conclude that pancreatic dysfunction may be a common and important overlooked factor contributing to ongoing malnutrition and diseases in malnourished Australian Aboriginal children.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents some results from preliminary analyses of the data from an international online survey of bicycle riders who reported riding at least once a month. On 4 July 2015, data from 7528 participants from 17 countries were available in the survey; the data were subsequently cleaned and checked for consistency. The median distance ridden ranged from 30 km/week in Israel to 150 km/week in Greece (overall median 54 km/week). City/hybrid bicycles were the most common type of bicycle ridden (44%), followed by mountain (20%) and road bikes (15%). Almost half (47%) of the respondents rode “nearly daily”. About a quarter rode daily to work or study (27%). Overall, 40% of respondents reported wearing a helmet ‘always’, varying from 2% in the Netherlands to 80% in Norway, while 25% reported ‘never’ wearing a helmet. Thus, individuals appeared to consistently either use or not use helmets. Helmet wearing rates were generally higher when riding for health/fitness than for other purposes and appeared to be little affected by the type of riding location, but some divergences in these patterns were found among countries. Almost 29% of respondents reported being involved in at least one bicycle crash in the last year (ranging from 12% in Israel to 53% in Turkey). Among the most severe crashes for each respondent, about half involved falling off a bicycle. Just under 10% of the most severe crashes for each respondent were reported to police. Among the bicycle-motor vehicle crashes, only a third were reported to police. Further analyses will address questions regarding the influence of factors such as demographic characteristics, type of bicycle ridden, and attitudes on both bicycle use and helmet wearing rates.

Relevance: 20.00%

Publisher:

Abstract:

We consider the problem of tracking a maneuvering target in clutter. In such an environment, missed detections and false alarms make it impossible to decide, with certainty, the origin of received echoes. Processing radar returns in cluttered environments consists of three functions: 1) target detection and plot formation, 2) plot-to-track association, and 3) track updating. Present approaches have two inadequacies: 1) optimization of the detection characteristics has not been considered, and 2) the features that can be used in the plot-to-track correlation process are restricted to a specific class. This paper presents a new approach to overcome these limitations. This approach facilitates tracking of a maneuvering target in clutter and improves tracking performance for weak targets.
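
For orientation, the sketch below shows a conventional baseline for the plot-to-track association step, gated nearest-neighbour association on the Mahalanobis distance; it is not the new approach proposed in the paper, whose detection optimization and extended feature set go beyond this.

    # Gated nearest-neighbour plot-to-track association (standard baseline).
    import numpy as np

    def gated_nearest_neighbour(track_pred, innov_cov, plots, gate=9.21):
        # Accept plots inside a chi-square gate on the Mahalanobis distance and
        # associate the nearest one; gate = 9.21 is the 99% point of a
        # chi-square distribution with 2 degrees of freedom.
        S_inv = np.linalg.inv(innov_cov)
        best, best_d2 = None, gate
        for k, z in enumerate(plots):
            innov = z - track_pred
            d2 = innov @ S_inv @ innov
            if d2 < best_d2:
                best, best_d2 = k, d2
        return best    # index of the associated plot, or None (missed detection)

    # Toy usage: predicted position, innovation covariance and three echoes
    pred = np.array([10.0, 5.0])
    S = np.diag([4.0, 4.0])
    plots = np.array([[30.0, -2.0], [11.5, 5.5], [9.0, 20.0]])
    print(gated_nearest_neighbour(pred, S, plots))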

Relevance: 20.00%

Publisher:

Abstract:

This study reports a diachronic corpus investigation of common-number pronouns used to convey unknown or otherwise unspecified reference. The study charts agreement patterns in these pronouns in various diachronic and synchronic corpora. The objective is to provide base-line data on variant frequencies and distributions in the history of English, as there are no previous systematic corpus-based observations on this topic. This study seeks to answer the questions of how pronoun use is linked with the overall typological development in English and how their diachronic evolution is embedded in the linguistic and social structures in which they are used. The theoretical framework draws on corpus linguistics and historical sociolinguistics, grammaticalisation, diachronic typology, and multivariate analysis of modelling sociolinguistic variation. The method employs quantitative corpus analyses from two main electronic corpora, one from Modern English and the other from Present-day English. The Modern English material is the Corpus of Early English Correspondence, and the time frame covered is 1500-1800. The written component of the British National Corpus is used in the Present-day English investigations. In addition, the study draws supplementary data from other electronic corpora. The material is used to compare the frequencies and distributions of common-number pronouns between these two time periods. The study limits the common-number uses to two subsystems, one anaphoric to grammatically singular antecedents and one cataphoric, in which the pronoun is followed by a relative clause. Various statistical tools are used to process the data, ranging from cross-tabulations to multivariate VARBRUL analyses in which the effects of sociolinguistic and systemic parameters are assessed to model their impact on the dependent variable. This study shows how one pronoun type has extended its uses in both subsystems, an increase linked with grammaticalisation and the changes in other pronouns in English through the centuries. The variationist sociolinguistic analysis charts how grammaticalisation in the subsystems is embedded in the linguistic and social structures in which the pronouns are used. The study suggests a scale of two statistical generalisations of various sociolinguistic factors which contribute to grammaticalisation and its embedding at various stages of the process.
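
Since VARBRUL-style variable rule analysis is essentially a logistic regression over categorical factor groups, the sketch below shows a modern stand-in using hypothetical token-level data; the factor groups, their levels and the simulated values are ours and only illustrate the kind of multivariate modelling described.

    # Logistic regression over categorical sociolinguistic factors.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one row per pronoun token, with the
    # variant choice coded 0/1 and a few illustrative factor groups.
    rng = np.random.default_rng(4)
    n = 1000
    data = pd.DataFrame({
        "variant":   rng.integers(0, 2, n),      # 1 = incoming variant
        "period":    rng.choice(["1500-1599", "1600-1699", "1700-1800"], n),
        "subsystem": rng.choice(["anaphoric", "cataphoric"], n),
        "gender":    rng.choice(["female", "male"], n),
    })

    # The fitted odds ratios play the role of VARBRUL factor weights.
    model = smf.logit("variant ~ C(period) + C(subsystem) + C(gender)",
                      data=data).fit(disp=False)
    print(np.exp(model.params))                  # odds ratios per factor level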