925 results for Generalized Convexity


Relevance:

10.00%

Publisher:

Abstract:

Background: Mood and anxiety disorders pose significant health burdens on the community. Kava and St John’s wort (SJW) are the most commonly used herbal medicines in the treatment of anxiety and depressive disorders, respectively. Objectives: To conduct a comprehensive review of kava and SJW, to review any evidence of efficacy, mode of action, pharmacokinetics, safety and use in Major Depressive Disorder (MDD), Bipolar Disorder (BP), Seasonal Affective Disorder (SAD), Generalized Anxiety Disorder (GAD), Social Phobia (SP), Panic Disorder (PD), Obsessive-Compulsive Disorder (OCD), and Post-Traumatic Stress Disorder (PTSD). Methods: A systematic review was conducted using the electronic databases MEDLINE, CINAHL, and The Cochrane Library during late 2008. The search criteria involved mood and anxiety disorder search terms in combination with kava, Piper methysticum, kavalactones, St John’s wort, Hypericum perforatum, hypericin and hyperforin. Additional search criteria for safety, pharmacodynamics, and pharmacokinetics were employed. A subsequent forward search of the papers was conducted using the Web of Science cited reference search. Results: Current evidence supports the use of SJW in treating mild-to-moderate depression, and of kava in the treatment of generalized anxiety. With respect to the other disorders, only weak preliminary evidence exists for the use of SJW in SAD. Currently there are no published human trials on the use of kava in affective disorders, or in OCD, PTSD, PD or SP. These disorders constitute potential applications that warrant exploration. Conclusions: Current evidence for herbal medicines in the treatment of depression and anxiety only supports the use of Hypericum perforatum for depression, and Piper methysticum for generalized anxiety.

Relevance:

10.00%

Publisher:

Abstract:

Ad hoc networks are vulnerable to attacks due to their distributed nature and lack of infrastructure. Intrusion detection systems (IDS) provide audit and monitoring capabilities that offer local security to a node and help to perceive the specific trust level of other nodes. Clustering protocols can be taken as an additional advantage in these processing-constrained networks to collaboratively detect intrusions with less power usage and minimal overhead. Existing clustering protocols are not suitable for intrusion detection purposes, because they are tied to routes. Route establishment and route renewal affect the clusters and, as a consequence, the processing and traffic overhead increases due to the instability of clusters. Ad hoc networks are battery- and power-constrained, and therefore a trusted monitoring node should be available to detect and respond to intrusions in time. This can be achieved only if the clusters are stable for a long period of time. If the clusters change regularly due to routes, intrusion detection will not prove to be effective. Therefore, a generalized clustering algorithm has been proposed that can run on top of any routing protocol and can monitor intrusions constantly irrespective of the routes. The proposed simplified clustering scheme has been used to detect intrusions, resulting in high detection rates and low processing and memory overhead irrespective of the routes, connections, traffic types and mobility of nodes in the network. Clustering is also useful for detecting intrusions collaboratively, since an individual node can neither detect a malicious node alone nor take action against that node on its own.
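The thesis's own algorithm is not reproduced in this abstract; as a rough sketch of route-independent clustering under the stated constraints, the following toy election builds clusters from one-hop connectivity alone (a classic lowest-ID scheme), so cluster membership survives route changes. All identifiers are hypothetical.

```python
# Hypothetical sketch: route-independent cluster head election based only on
# one-hop neighbourhood information (classic lowest-ID clustering), not the
# thesis algorithm itself. Clusters depend on connectivity, never on routes.

def elect_cluster_heads(adjacency):
    """adjacency: dict mapping node id -> set of one-hop neighbour ids."""
    heads = {}
    for node in sorted(adjacency):  # decide nodes in ascending id order
        neighbours = adjacency[node]
        # A node becomes a head if no lower-id neighbour is already a head.
        if not any(n in heads for n in neighbours if n < node):
            heads[node] = {node} | neighbours  # head monitors its one-hop members
    return heads

if __name__ == "__main__":
    topology = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}
    print(elect_cluster_heads(topology))  # -> {1: {1, 2, 3}, 4: {3, 4, 5}}
```

Because the election reads only neighbourhood state, a cluster head can keep monitoring its members for intrusions even while routes through the cluster are torn down and rebuilt.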

Relevance:

10.00%

Publisher:

Abstract:

GMPLS is a generalized form of MPLS (Multiprotocol Label Switching). MPLS is IP-packet based and uses MPLS-TE for packet traffic engineering. GMPLS extends MPLS capabilities: it separates the transmission, control, and management planes. The control plane enables applications such as traffic engineering, service provisioning, and differentiated services. The GMPLS control plane architecture includes signaling (RSVP-TE, CR-LDP) and routing (OSPF-TE, ISIS-TE) protocols. This paper provides an overview of the signaling protocols, describes their main functionalities, and provides a general evaluation of both protocols.
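As a toy illustration of the downstream-on-demand signaling pattern these protocols share (a label request travelling ingress to egress, with labels then distributed hop by hop back upstream, as in an RSVP-TE Path/Resv exchange), the sketch below is schematic only; message and field names are simplified and not wire-accurate.

```python
# Toy model of downstream-on-demand label distribution along an explicit
# route, loosely patterned on an RSVP-TE Path/Resv exchange. Field names
# are illustrative only; real messages carry many more objects (RFC 3209).
import itertools

_label_counter = itertools.count(100)

def setup_lsp(explicit_route):
    """Walk a label request downstream, then distribute labels hop by hop upstream."""
    # Path phase: the request travels ingress -> egress along the explicit route.
    print("Path message traverses:", " -> ".join(explicit_route))
    # Resv phase: each downstream node allocates a label and advertises it
    # to its upstream neighbour, building the label-switched path.
    label_map = {}
    for upstream, downstream in zip(explicit_route, explicit_route[1:]):
        label_map[(upstream, downstream)] = next(_label_counter)
    for (upstream, downstream), label in reversed(list(label_map.items())):
        print(f"Resv: {downstream} assigns label {label} to upstream {upstream}")
    return label_map

if __name__ == "__main__":
    setup_lsp(["ingress", "R1", "R2", "egress"])
```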

Relevance:

10.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivity. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as human visual sensitivity.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands.

To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.

For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
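For reference, the generalized Gaussian density underlying the coefficient model has the standard textbook form below (generic notation, not the thesis's own), where α is a scale parameter and β the shape parameter; β = 2 recovers the Gaussian and β = 1 the Laplacian, both common in subband modelling.

```latex
% Standard generalized Gaussian density; generic notation, not the thesis's.
f(x) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}
       \exp\!\left(-\left(\frac{|x|}{\alpha}\right)^{\beta}\right),
\qquad \alpha > 0,\ \beta > 0.
% \beta = 2 gives the Gaussian; \beta = 1 gives the Laplacian.
```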

Relevance:

10.00%

Publisher:

Abstract:

This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on modelling and computation of normalization constants arose from the pursuit of these data analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of the zeroes recorded: these may represent a zero response given some threshold (presence), or that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses, while taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts, and the dingo, cypress and toad case studies described in the motivation chapter are examples of this.

Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled by using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors. The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for the parameters of these main models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics. Model choice can be assessed by incorporating another tier in the modelling hierarchy. This requires evaluation of a normalization constant, a notoriously difficult problem.

The difficulty of estimating the normalization constant for the MRF can be overcome by using a path integral approach, although this is a highly computationally intensive method. Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present, though not fully developed, in the literature, and propose the integrated mean canonical statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model. This work introduces the background computations required for the full implementation of the four-tier model in Chapter 7.

Two different extensions of the three-tier model to a four-tier version are investigated. The first extension incorporates temporal dependence in the underlying spatio-temporal process. The second extension allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer. A major contribution of the thesis is the development, for the first time, of a fully Bayesian approach to inference for these hierarchical models. Note: The author of this thesis has agreed to make it open access but invites people downloading the thesis to send her an email via the 'Contact Author' function.
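The IMCS method sits inside the path sampling framework of Gelman & Meng (1998); for an exponential-family model such as the autologistic, the log ratio of normalization constants reduces to an integral of the mean canonical statistic. The identity below is the standard one, written in generic notation rather than the thesis's:

```latex
% For p_\theta(x) = \exp\{\theta^{\top} S(x)\} / Z(\theta), differentiating
% \log Z gives \nabla_\theta \log Z(\theta) = E_\theta[S(x)]; hence along a
% path \theta(t), t \in [0, 1], from \theta_0 to \theta_1:
\log \frac{Z(\theta_1)}{Z(\theta_0)}
  = \int_0^1 E_{\theta(t)}\!\left[S(x)\right]^{\top}
    \frac{d\theta(t)}{dt}\, dt.
% The expectations E_{\theta(t)}[S(x)] can be estimated by MCMC at a grid
% of t values and the integral evaluated numerically.
```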

Relevance:

10.00%

Publisher:

Abstract:

Background: The two-stage Total Laparoscopic Hysterectomy (TLH) versus Total Abdominal Hysterectomy (TAH) for stage I endometrial cancer (LACE) randomised controlled trial was initiated in 2005. The primary objective of stage 1 was to assess whether TLH results in equivalent or improved quality of life (QoL) up to 6 months after surgery compared to TAH. The primary objective of stage 2 was to test the hypothesis that disease-free survival at 4.5 years is equivalent for TLH and TAH. Results addressing the primary objective of stage 1 of the LACE trial are presented here. Methods: The first 361 LACE participants (TAH n=142, TLH n=190) were enrolled in the QoL substudy at 19 centres across Australia, New Zealand and Hong Kong, and 332 completed the QoL analysis. Randomisation was performed centrally and independently of other study procedures via a computer-generated, web-based system (providing concealment of the next assigned treatment) using stratified permuted blocks of 3 and 6, and assigned patients with histologically confirmed stage I endometrioid endometrial adenocarcinoma and ECOG performance status <2 to TLH or TAH, stratified by histological grade and study centre. No blinding of patients or study personnel was attempted. QoL was measured at baseline, 1 and 4 weeks (early), and 3 and 6 months (late) after surgery using the Functional Assessment of Cancer Therapy-General (FACT-G) questionnaire. The primary endpoint was the difference between the groups in QoL change from baseline at the early and late time points (a 5% difference was considered clinically significant). Analysis was performed according to the intention-to-treat principle using generalized estimating equations on differences from baseline for early and late QoL recovery. The LACE trial is registered with clinicaltrials.gov (NCT00096408) and the Australian New Zealand Clinical Trials Registry (CTRN12606000261516). Patients for both stages of the trial have now been recruited and are being followed up for disease-specific outcomes. Findings: The proportion of missing values at the 5%, 10%, 15% and 20% differences on the FACT-G scale was 6% (12/190) in the TLH group and 14% (20/142) in the TAH group. There were 8/332 conversions (2.4%; 7 of which were from TLH to TAH). In the early phase of recovery, patients undergoing TLH reported significantly greater improvement in QoL from baseline compared to TAH on all subscales except the emotional and social well-being subscales. Improvements in QoL up to 6 months post-surgery continued to favour TLH, except for the emotional and social well-being subscales of the FACT-G and the visual analogue scale of the EuroQoL five dimensions (EuroQoL-VAS). Operating time was significantly longer in the TLH group (138±43 min) than in the TAH group (109±34 min; p=0.001). While the proportion of intraoperative adverse events was similar between the treatment groups (TAH 8/142, 5.6%; TLH 14/190, 7.4%; p=0.55), postoperatively twice as many patients in the TAH group experienced adverse events of CTC grade 3+ as in the TLH group (33/142, 23.2% and 22/190, 11.6%, respectively; p=0.004). Postoperative serious adverse events occurred more frequently in patients who had a TAH (27/142, 19.0%) than a TLH (15/190, 7.9%; p=0.002). Interpretation: QoL improvements from baseline during the early and later phases of recovery, and the adverse event profile, significantly favour TLH over TAH for patients treated for stage I endometrial cancer.
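As a minimal sketch of the kind of GEE analysis described (not the trial's actual code; the data file and column names are hypothetical), repeated FACT-G changes from baseline could be modelled with an exchangeable working correlation within patients:

```python
# Hypothetical sketch of a GEE analysis of QoL change from baseline,
# not the LACE trial's actual code; the file and column names are made up.
import pandas as pd
import statsmodels.api as sm
from statsmodels.genmod.cov_struct import Exchangeable

df = pd.read_csv("lace_qol_long.csv")  # hypothetical long-format file:
# one row per patient per visit, with columns
#   patient_id, arm ("TLH"/"TAH"), phase ("early"/"late"), factg_change

model = sm.GEE.from_formula(
    "factg_change ~ C(arm) * C(phase)",  # treatment-by-phase differences
    groups="patient_id",                 # repeated measures within patient
    cov_struct=Exchangeable(),           # exchangeable working correlation
    family=sm.families.Gaussian(),       # continuous FACT-G change scores
    data=df,
)
print(model.fit().summary())
```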

Relevance:

10.00%

Publisher:

Abstract:

This project discusses a component of a research study conducted to provide construction organizations with a generic benchmarking framework for assessing their extent of information communication technology (ICT) adoption for building project management processes. It defines benchmarking and discusses the objectives of the required benchmarking framework and its development. The study focuses on ICT adoption by small and medium enterprises (SMEs) in the construction industry, and for SMEs it is important to understand processes, their indicators, and their measures in the local context. The structure of the suggested benchmarking framework was derived from an extensive literature survey and a questionnaire survey conducted in the Indian construction industry. The suggested benchmarking process is iterative and divided into four stages. It can be implemented at the organization and industry levels for rating construction organizations on ICT adoption and performance measurement. The framework has a generic structure and can be generalized and applied to other countries with due consideration.

Relevance:

10.00%

Publisher:

Abstract:

Many developing countries are afflicted by persistent inequality in the distribution of income. While a growing body of literature emphasizes differential fertility as a channel through which income inequality persists, this paper investigates differential child mortality – differences in the incidence of child mortality across socioeconomic groups – as a critical link in this regard. Using evidence from cross-country data to evaluate this linkage, we find that differential child mortality serves as a stronger channel than differential fertility in the transmission of income inequality over time. We use random-effects and generalized estimating equation techniques to account for temporal correlation within countries. The results are robust to the use of an alternative definition of fertility that reflects parental preference for children instead of realized fertility.
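A minimal sketch of the kind of random-effects panel specification described, assuming hypothetical variable names (none are from the paper), might look like this:

```python
# Hypothetical sketch of a random-effects specification for cross-country
# panel data, not the paper's actual model; file and variable names are made up.
import pandas as pd
import statsmodels.api as sm

panel = pd.read_csv("country_panel.csv")  # hypothetical: one row per country-year,
# with columns country, year, income_gini, diff_child_mortality, diff_fertility

# Random intercept per country absorbs country-level heterogeneity while the
# fixed effects test the mortality and fertility channels.
model = sm.MixedLM.from_formula(
    "income_gini ~ diff_child_mortality + diff_fertility + year",
    groups="country",
    data=panel,
)
print(model.fit().summary())
```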

Relevance:

10.00%

Publisher:

Abstract:

Speeding is recognized as a major contributing factor in traffic crashes. In order to reduce speed-related crashes, the city of Scottsdale, Arizona, implemented the first fixed-camera photo speed enforcement program (SEP) on a limited-access freeway in the US. The 9-month demonstration program, spanning January 2006 to October 2006, was implemented on a 6.5-mile urban freeway segment of Arizona State Route 101 running through Scottsdale. This paper presents the results of a comprehensive analysis of the impact of the SEP on speeding behavior, crashes, and the economic impact of crashes. The impact on speeding behavior was estimated using generalized least squares estimation, in which the observed speeds and speeding frequencies during the program period were compared to those during other periods. The impact of the SEP on crashes was estimated using three evaluation methods: a before-and-after (BA) analysis using a comparison group, a BA analysis with traffic flow correction, and an empirical Bayes BA analysis with time-variant safety. The analysis results reveal that speeding detection frequencies (speeds ≥ 76 mph) increased by a factor of 10.5 after the SEP was (temporarily) terminated. Average speeds in the enforcement zone were reduced by about 9 mph when the SEP was implemented, after accounting for the influence of traffic flow. All crash types were reduced except rear-end crashes, although the estimated magnitude of the impact varies across estimation methods (and their corresponding assumptions). When considering Arizona-specific crash-related injury costs, the SEP is estimated to yield about $17 million in annual safety benefits.
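For context, the empirical Bayes before-after method mentioned combines a model-based crash prediction with the site's observed count; the standard form (after Hauer, in generic notation, not necessarily the paper's exact specification) is:

```latex
% Standard empirical Bayes estimate of expected crashes at a site
% (generic notation, not necessarily the paper's exact specification):
\hat{\pi} = w\,\mu + (1 - w)\,x,
\qquad
w = \frac{1}{1 + \mu/\phi},
% where \mu is the crash frequency predicted by a safety performance
% function, x is the observed crash count, and \phi is the overdispersion
% parameter of the underlying negative binomial model.
```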

Relevance:

10.00%

Publisher:

Abstract:

Embedded generalized markup, as applied by digital humanists to the recording and studying of our textual cultural heritage, suffers from a number of serious technical drawbacks. As a result of its evolution from early printer control languages, generalized markup can only express a document’s ‘logical’ structure via a repertoire of permissible printed format structures. In addition to the well-researched overlap problem, the embedding of markup codes into texts that never had them when written leads to a number of further difficulties: the inclusion of potentially obsolescent technical and subjective information into texts that are supposed to be archivable for the long term, the manual encoding of information that could be better computed automatically, and the obscuring of the text by highly complex technical data. Many of these problems can be alleviated by separating the versions of which many cultural heritage texts are composed from their content. In this way the complex inter-connections between versions can be handled automatically, leaving only simple markup for individual versions to be handled by the user.
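As a toy illustration of that separation (hypothetical data structures, not the paper's actual model), each version can store plain text plus light standoff markup, with inter-version correspondences computed rather than hand-encoded:

```python
# Toy illustration of separating versions from content: each version stores
# plain text plus light standoff markup; alignments between versions are
# computed, not hand-encoded. Hypothetical structures, not the paper's model.
from dataclasses import dataclass, field
from difflib import SequenceMatcher

@dataclass
class Version:
    siglum: str                                 # witness identifier, e.g. "MS-A"
    text: str                                   # plain text of this version only
    markup: list = field(default_factory=list)  # (start, end, tag) standoff spans

def shared_spans(a: Version, b: Version):
    """Compute inter-version correspondences instead of encoding them."""
    matcher = SequenceMatcher(None, a.text, b.text)
    return [blk for blk in matcher.get_matching_blocks() if blk.size > 0]

v1 = Version("MS-A", "The quick brown fox", [(4, 9, "emph")])
v2 = Version("MS-B", "The quick red fox")
print(shared_spans(v1, v2))  # matching blocks recovered automatically
```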

Relevance:

10.00%

Publisher:

Abstract:

Enterprise Architectures have emerged as comprehensive corporate artefacts that provide structure to the plethora of conceptual views on an enterprise. The recent popularity of service-oriented organizational design has added services and related constructs as new elements that require consideration within an Enterprise Architecture. This paper analyzes and compares the existing proposals for how best to integrate services into Enterprise Architectures. It uses the popular Zachman Framework as an example and differentiates the existing integration alternatives. This research can be generalized beyond service integration into an investigation of how Enterprise Architectures might be extended with emerging constructs.