939 results for good lives model


Relevance: 80.00%

Abstract:

Recent evidence emerging from several laboratories, integrated with new data obtained by searching the genome databases, suggests that the area code hypothesis provides a good heuristic model for explaining the remarkable specificity of cell migration and tissue assembly that occurs throughout embryogenesis. The area code hypothesis proposes that cells assemble organisms, including their brains and nervous systems, with the aid of a molecular-addressing code that functions much like the country, area, regional, and local portions of the telephone dialing system. The complexity of the information required to code cells for the construction of entire organisms is so enormous that we assume that the code must make combinatorial use of members of large multigene families. Such a system would reuse the same receptors as molecular digits in various regions of the embryo, thus greatly reducing the total number of genes required. We present the hypothesis that members of the very large families of olfactory receptors and vomeronasal receptors fulfill the criteria proposed for area code molecules and could serve as the last digits in such a code. We discuss our evidence indicating that receptors of these families are expressed in many parts of developing embryos and suggest that they play a key functional role in cell recognition and targeting not only in the olfactory system but also throughout the brain and numerous other organs as they are assembled.
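To make the combinatorial argument concrete, here is a small illustrative sketch (not from the paper; the family size and digit count are hypothetical) of how reusing one receptor family as the "digits" of an address multiplies the number of distinct cell addresses without a matching increase in gene count:

```python
# Illustrative sketch (hypothetical numbers, not from the paper): how
# combinatorial reuse of one receptor family as address "digits" multiplies
# the number of distinct cell addresses without multiplying gene count.

n_receptors = 30   # size of one hypothetical receptor family (digit alphabet)
n_positions = 4    # digits per address: country / area / region / local

# Combinatorial code: every position reuses the same receptor family,
# so 30 genes suffice for 30**4 = 810,000 distinct addresses.
combinatorial_addresses = n_receptors ** n_positions
print(f"{n_receptors} receptors x {n_positions} digits -> "
      f"{combinatorial_addresses:,} unique cell addresses")

# A one-gene-per-address scheme would instead need 810,000 distinct genes.
example_address = (3, 17, 0, 22)   # one cell's address as receptor indices
print("example address:", example_address)
```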

Relevance: 80.00%

Abstract:

This thesis describes the Generative Topographic Mapping (GTM), a non-linear latent variable model intended for modelling continuous, intrinsically low-dimensional probability distributions embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map, a widely established neural network model for unsupervised learning, resolving many of its associated theoretical problems. An important potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points, as they appear when visualized, to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required grows exponentially with the intrinsic dimensionality of the density model; however, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different from that, the aim of maintaining an 'interpretable' structure, suitable for visualizing data, may come into conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
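As a rough illustration of how such a model can be fitted, the following is a minimal GTM-style sketch in Python/NumPy, assuming a 2-D latent grid, Gaussian RBF basis functions and EM training; it is a toy under these assumptions, not the thesis implementation:

```python
# Minimal GTM sketch (illustrative, not the thesis implementation):
# a 2-D latent grid mapped through Gaussian RBFs into data space,
# fitted as a constrained Gaussian mixture by EM.
import numpy as np

def fit_gtm(X, grid=10, n_rbf=4, sigma=1.0, beta=1.0, iters=30, reg=1e-3):
    N, D = X.shape
    # Latent grid points z_k and RBF centres mu_j in [-1, 1]^2.
    g = np.linspace(-1, 1, grid)
    Z = np.array([[a, b] for a in g for b in g])            # K x 2
    c = np.linspace(-1, 1, n_rbf)
    M = np.array([[a, b] for a in c for b in c])            # J x 2
    # Design matrix Phi: K x (J+1), Gaussian RBFs plus a bias column.
    d2 = ((Z[:, None, :] - M[None, :, :]) ** 2).sum(-1)
    Phi = np.hstack([np.exp(-d2 / (2 * sigma**2)), np.ones((len(Z), 1))])
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(Phi.shape[1], D))       # (J+1) x D
    for _ in range(iters):
        Y = Phi @ W                                          # K x D mixture centres
        dist = ((X[None, :, :] - Y[:, None, :]) ** 2).sum(-1)  # K x N
        # E-step: responsibility of each latent point for each data point.
        R = np.exp(-0.5 * beta * (dist - dist.min(0)))
        R /= R.sum(0, keepdims=True)
        # M-step: weighted least squares for W, closed-form update for beta.
        G = np.diag(R.sum(1))
        A = Phi.T @ G @ Phi + reg * np.eye(Phi.shape[1])
        W = np.linalg.solve(A, Phi.T @ (R @ X))
        beta = X.size / (R * ((X[None] - (Phi @ W)[:, None]) ** 2).sum(-1)).sum()
    return Z, Phi, W, R

# Visualization: each data point x_n is plotted at its posterior-mean latent
# position sum_k R[k, n] * Z[k].
```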

Relevance: 80.00%

Abstract:

This study aims to investigate to what extent the views of the managers of the enterprises to be privatized are a barrier to the smooth implementation of privatization, as opposed to other problems. Accordingly, the research tackles two main issues: identification and analysis of the major problems encountered in the implementation of the Egyptian privatization programme, at which levels these problems exist, and different approaches to tackle them; and the views of public-sector top and middle-level managers regarding the main issues of privatization. The study relies upon a literature survey, interviews with stakeholders, a survey of managers' attitudes and several illustrative case studies. A model of "good practice" for the smooth and effective implementation of privatization has been designed. Practice in Egypt has then been studied and compared with the "good practice" model. A lack of strictness and firmness in implementing the announced privatization programme was found to be characteristic of Egyptian practice. This is partly attributable to the inadequacy of the programme and partly to various obstacles to implementation. The main obstacles are the doubtful desirability of privatization on the part of managers at different implementation levels, resistance of stakeholders, inadequacy of the legal framework governing privatization, redundant labour, lack of an efficient monitoring system allowing for accountability, inefficient marketing of privatization, ineffective communication, insufficient information at different levels, and problems related to valuation and selling procedures. A large part of the thesis is concerned with SOE (State-Owned Enterprise) managers' attitudes towards, and understanding of, privatization, appraised through surveys. Although most managers stated their acceptance of privatization, many of their responses show that they do not accept selling SOEs. They understand privatization to include enterprise reform and restructuring, changing procedures and giving more authority to company executives, but not necessarily selling SOEs. The majority of managers still see many issues that have to be addressed for smooth implementation of privatization, e.g. insufficiency of information, incompleteness of the legal framework, restructuring and labour problems. The main contribution to knowledge of this thesis is the study of the problems of implementing privatization in developing countries, especially managers' resistance to privatization as a major change, partly because of the threat it poses and partly because of a lack of understanding of privatization and the implications of operating private businesses. A programme of persuading managers and offsetting the unfavourable effects is recommended as an outcome of the study. Keywords for the national Index to Theses: Egypt; privatization; implementation of privatization; problems of implementing privatization; managers' attitudes towards privatization.

Relevance: 80.00%

Abstract:

In an Arab oil-producing country in the Middle East such as Kuwait, the oil industry is the country's main and most important industry. Its importance stems from the significant role it plays in both the national economy and the global economy; its criticality also derives from its interconnection with national security and power in the Middle East region. Conducting this research in this crucial industry therefore adds value for companies in the sector, as it thoroughly investigates the main components of the TQM implementation process and identifies which components significantly affect TQM implementation and the business results gained from it. In addition, the oil sector is a large sector known for its wealth of employees with different national cultures and backgrounds; this culture-heterogeneous industry thus appears to be the most appropriate environment in which to address a need in the literature to investigate the effects of national culture values on the TQM implementation process. Furthermore, this research has developed a new conceptual model of the TQM implementation process in the Kuwaiti oil industry that applies in general to operations and production organizations in the Kuwaiti business environment and in particular to organizations in the oil industry; it also serves as a good theoretical model for improving the operations and production level of the oil industry in other developing and developed countries. The research findings thus narrow a gap in the literature: the limited amount of empirical research on TQM implementation in well-developed industries in Arab developing countries, and specifically in Kuwait, where there was no coherent national model for universal TQM implementation in the Kuwaiti oil industry in particular or in the Kuwaiti business environment in general. Finally, this newly developed research framework, which emerged from the literature search, was validated by rigorous quantitative analysis tools, including SPSS and Structural Equation Modeling. The quantitative findings from the questionnaires collected were supported by the qualitative findings of the interviews conducted.

Relevance: 80.00%

Abstract:

Recently, wireless network technology has grown at such a pace that scientific research has become practical reality within a very short time span. Mobile wireless communications have witnessed the adoption of several generations, each of them complementing and improving on the former. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers one of the most important characteristics of future 4G mobile systems to be the ability to guarantee reliable communications from 100 Mbps, for high-mobility links, up to 1 Gbps for low-mobility users, in addition to high efficiency in spectrum usage. In mobile wireless communications networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable and they must rely upon satellite coverage. A good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. This technique must adapt to the characteristics of the satellite channel and also use the allocated bandwidth efficiently. Satellite links are fading channels when used by mobile users. Measures designed to cope with these fading environments include: (1) spatial diversity (a two-receive-antenna configuration); (2) time diversity (channel interleaving/spreading techniques); and (3) upper-layer FEC. The author proposes the use of OFDM (Orthogonal Frequency-Division Multiplexing) for the satellite link, increasing the time diversity. This technique will allow an increase of the data rate, as required primarily by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation considers the use of cooperative satellite communications for hybrid satellite/terrestrial networks. With this technique, satellite coverage can be extended to areas where there is no direct link to the satellite. For this purpose, a good channel model is necessary.
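To illustrate the basic OFDM mechanism the dissertation relies on, here is a minimal transmit/receive sketch over an ideal channel; the FFT size, prefix length and QPSK mapping are illustrative assumptions, not parameters from the dissertation:

```python
# Minimal OFDM transmitter/receiver sketch (illustrative parameters,
# ideal channel): data is mapped to QPSK symbols, placed on orthogonal
# subcarriers with an IFFT, and a cyclic prefix guards against multipath.
import numpy as np

n_subcarriers = 64     # hypothetical FFT size
cp_len = 16            # hypothetical cyclic-prefix length

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=2 * n_subcarriers)

# QPSK mapping: one complex symbol per bit pair.
symbols = ((2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)) / np.sqrt(2)

# IFFT turns the frequency-domain symbols into one time-domain OFDM symbol.
time_signal = np.fft.ifft(symbols)

# Cyclic prefix: copy the tail in front so multipath delay stays circular.
tx = np.concatenate([time_signal[-cp_len:], time_signal])

# Receiver (ideal channel): drop the prefix, FFT back, detect the bits.
rx_symbols = np.fft.fft(tx[cp_len:])
rx_bits = np.empty_like(bits)
rx_bits[0::2] = (rx_symbols.real > 0).astype(int)
rx_bits[1::2] = (rx_symbols.imag > 0).astype(int)
assert np.array_equal(bits, rx_bits)  # lossless over an ideal channel
```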

Relevance: 80.00%

Abstract:

Image super-resolution is defined as a class of techniques that enhance the spatial resolution of images. Super-resolution methods can be subdivided into single-image and multi-image methods. This thesis focuses on developing algorithms based on mathematical theories for single-image super-resolution problems. Indeed, in order to estimate an output image, we adopt a mixed approach: i.e., we use both a dictionary of patches with sparsity constraints (typical of learning-based methods) and regularization terms (typical of reconstruction-based methods). Although the existing methods already perform well, they do not take into account the geometry of the data to: regularize the solution; cluster data samples (samples are often clustered using algorithms with the Euclidean distance as a dissimilarity metric); or learn dictionaries (they are often learned using PCA or K-SVD). Thus, state-of-the-art methods still suffer from shortcomings. In this work, we propose three new methods to overcome these deficiencies. First, we developed SE-ASDS (a structure-tensor-based regularization term) in order to improve the sharpness of edges; SE-ASDS achieves much better results than many state-of-the-art algorithms. Then, we proposed the AGNN and GOC algorithms for determining a local subset of training samples from which a good local model can be computed for reconstructing a given input test sample, taking into account the underlying geometry of the data; AGNN and GOC outperform spectral clustering, soft clustering and geodesic-distance-based subset selection in most settings. Next, we proposed the aSOB strategy, which takes into account the geometry of the data and the dictionary size; aSOB outperforms both the PCA and PGA methods. Finally, we combine all our methods in a single algorithm, named G2SR. Our proposed G2SR algorithm shows better visual and quantitative results when compared to state-of-the-art methods.
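For intuition about the "local subset, then local model" step that AGNN and GOC improve upon, the following is a hedged baseline sketch: plain Euclidean k-NN selection followed by a local ridge regression from LR to HR patches. The geometry-aware neighbourhoods of AGNN/GOC would replace the Euclidean step; all names and shapes here are illustrative:

```python
# Illustrative baseline (not AGNN/GOC themselves): select a local subset of
# training patches by plain Euclidean k-NN, then fit a local linear map from
# low-resolution (LR) to high-resolution (HR) patches. AGNN/GOC replace this
# Euclidean neighbourhood with one that respects the data's geometry.
import numpy as np

def local_sr_model(lr_train, hr_train, lr_test, k=50, reg=1e-3):
    """lr_train: N x d LR patches, hr_train: N x D HR patches,
    lr_test: d-dim test patch. Returns the estimated HR patch."""
    # 1. Local subset: the k training samples nearest to the test patch.
    d2 = ((lr_train - lr_test) ** 2).sum(axis=1)
    idx = np.argsort(d2)[:k]
    Xl, Xh = lr_train[idx], hr_train[idx]
    # 2. Local model: ridge regression mapping LR -> HR on that subset.
    A = Xl.T @ Xl + reg * np.eye(Xl.shape[1])
    W = np.linalg.solve(A, Xl.T @ Xh)          # d x D
    return lr_test @ W
```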

Relevance: 80.00%

Abstract:

Visual recognition is a fundamental research topic in computer vision. This dissertation explores datasets, features, learning, and models used for visual recognition. In order to train visual models and evaluate different recognition algorithms, this dissertation develops an approach to collect object image datasets from web pages using an analysis of the text around each image and of image appearance. This method exploits established online knowledge resources (Wikipedia pages for text; the Flickr and Caltech datasets for images), which provide rich text and object appearance information. This dissertation describes results on two datasets. The first is Berg's collection of 10 animal categories; on this dataset, we significantly outperform previous approaches. On an additional set of 5 categories, experimental results show the effectiveness of the method. Images are represented as features for visual recognition. This dissertation introduces a text-based image feature and demonstrates that it consistently improves performance on hard object classification problems. The feature is built using an auxiliary dataset of images annotated with tags, downloaded from the Internet. Image tags are noisy. The method obtains the text feature of an unannotated image from the tags of its k-nearest neighbors in this auxiliary collection. A visual classifier presented with an object viewed under novel circumstances (say, a new viewing direction) can rely only on its visual examples, whereas this text feature may not change, because the auxiliary dataset likely contains a similar picture. While the tags associated with images are noisy, they are more stable than appearance when viewing conditions change. The performance of this feature is tested using the PASCAL VOC 2006 and 2007 datasets. The feature performs well; it consistently improves the performance of visual object classifiers, and is particularly effective when the training dataset is small. With more and more collected training data, computational cost becomes a bottleneck, especially when training sophisticated classifiers such as kernelized SVMs. This dissertation proposes a fast training algorithm called the Stochastic Intersection Kernel Machine (SIKMA). This training method will be useful for many vision problems, as it can produce a kernel classifier that is more accurate than a linear classifier, and can be trained on tens of thousands of examples in two minutes. It processes training examples one by one in a sequence, so memory cost is no longer the bottleneck when processing large-scale datasets. This dissertation applies the approach to train classifiers for Flickr groups, each with many training examples. The resulting Flickr group prediction scores can be used to measure the similarity between two images. Experimental results on the Corel dataset and a PASCAL VOC dataset show that the learned Flickr features perform better for image matching, retrieval, and classification than conventional visual features. Visual models are usually trained to best separate positive and negative training examples. However, when recognizing a large number of object categories, there may not be enough training examples for most objects, due to the intrinsic long-tailed distribution of objects in the real world. This dissertation proposes an approach that uses comparative object similarity. The key insight is that, given a set of object categories which are similar and a set of categories which are dissimilar, a good object model should respond more strongly to examples from similar categories than to examples from dissimilar categories. This dissertation develops a regularized kernel machine algorithm to use this category-dependent similarity regularization. Experiments on hundreds of categories show that our method can make significant improvements for categories with few or even no positive examples.
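The k-nearest-neighbour tag-transfer idea can be sketched in a few lines; the data layout below (descriptor matrices, a binary tag matrix) is a hypothetical simplification, not the dissertation's pipeline:

```python
# Sketch of the k-NN tag-transfer idea (hypothetical data layout): an
# unannotated image borrows a text feature from the tags of its k nearest
# neighbours in an auxiliary tagged collection.
import numpy as np

def text_feature(query_visual, aux_visual, aux_tags, k=5):
    """query_visual: d-dim visual descriptor; aux_visual: N x d descriptors
    of the auxiliary tagged images; aux_tags: N x V binary tag matrix.
    Returns a V-dim text feature for the query image."""
    d2 = ((aux_visual - query_visual) ** 2).sum(axis=1)
    nn = np.argsort(d2)[:k]
    # Average the neighbours' tag vectors; noisy individual tags cancel out.
    return aux_tags[nn].mean(axis=0)
```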

Relevance: 80.00%

Abstract:

The importance of e-government models lies in their offering a basis to measure and guide e-government. There is still no agreement on how to assess a government online; most e-government models are not based on research, nor are they validated; and in most countries e-government has not reached the higher stages of growth. Several scholars have presented a confusing picture of e-government. What is lacking is an in-depth analysis of e-government models. Responding to the need for such an analysis, this study identifies the strengths and weaknesses of major national and local e-government evaluation models. The common limitations of most models are focusing on the government rather than the citizen, missing qualitative measures, constructing the e-equivalent of a bureaucratic administration, and defining general criteria without sufficient validation. In addition, this study has found that the metrics defined for national e-government are not suitable for municipalities, and most of the existing studies have focused on national e-government even though local e-government is closer to citizens. There is a need to develop a good theoretical model for both national and local municipal e-government.

Relevance: 40.00%

Abstract:

Chronic wounds are a significant socioeconomic problem for governments worldwide. Approximately 15% of people who suffer from diabetes will experience a lower-limb ulcer at some stage of their lives, and 24% of these wounds will ultimately result in amputation of the lower limb. Hyperbaric oxygen therapy (HBOT) has been shown to aid the healing of chronic wounds; however, the causal reasons for the improved healing remain unclear, and hence current HBOT protocols remain empirical. Here we develop a three-species mathematical model of wound healing that is used to simulate the application of hyperbaric oxygen therapy in the treatment of wounds. Based on our modelling, we predict that intermittent HBOT will assist chronic wound healing, while normobaric oxygen is ineffective in treating such wounds. Furthermore, treatment should continue until healing is complete, and HBOT will not stimulate healing under all circumstances, leading us to conclude that finding the right protocol for an individual patient is crucial if HBOT is to be effective. We provide constraints, which depend on the model parameters, on the range of HBOT protocols that will stimulate healing. More specifically, we predict that patients with a poor arterial supply of oxygen, high consumption of oxygen by the wound tissue, chronically hypoxic wounds, and/or a dysfunctional endothelial cell response to oxygen are at risk of non-responsiveness to HBOT. This work highlights which patients are most likely to respond well to HBOT (for example, those with a good arterial supply) and thus has the potential to help improve both the success rate and the cost-effectiveness of this therapy.
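The flavour of such a model can be conveyed with a schematic toy, which is not the paper's three-species system and whose parameters are all hypothetical: oxygen driven by arterial supply plus intermittent HBOT pulses, and a healing variable that progresses only above a hypoxia threshold:

```python
# Schematic toy model (not the paper's three-species system): oxygen is
# supplied by the arterial network plus intermittent HBOT sessions and is
# consumed by wound tissue; "healing" progresses only while oxygen sits
# above a hypoxia threshold. All parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

def hbot_on(t, session=1.5, period=24.0):
    """Intermittent protocol: 1.5 h of hyperbaric oxygen every 24 h."""
    return (t % period) < session

def rhs(t, y, supply=0.25, hbot_boost=2.0, consumption=0.6, rate=0.1):
    oxygen, healed = y
    source = supply + (hbot_boost if hbot_on(t) else 0.0)
    d_oxygen = source - consumption * oxygen
    # Healing only while oxygen exceeds the hypoxia threshold (0.5 here);
    # with supply=0.25 alone, steady-state oxygen stays below it, so the
    # toy reproduces "normobaric oxygen alone is ineffective".
    d_healed = rate * max(oxygen - 0.5, 0.0) * (1.0 - healed)
    return [d_oxygen, d_healed]

sol = solve_ivp(rhs, (0.0, 24.0 * 14), [0.2, 0.0], max_step=0.25)
print(f"healed fraction after 14 days: {sol.y[1, -1]:.2f}")
```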

Relevance: 40.00%

Abstract:

The proton radioactivity half-lives of spherical proton emitters are calculated with the cluster model, with the contribution of the centrifugal potential barrier considered separately. The results are compared with the experimental data and with other theoretical values, and good agreement is found for most nuclei. In addition, two formulae are proposed for the proton-decay half-lives of spherical proton emitters through a least-squares fit to the available experimental data; they reproduce the experimental half-lives successfully.
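The fitting step can be sketched as follows; the Geiger-Nuttall-like functional form below, with a centrifugal l(l+1) term, is an illustrative assumption rather than the paper's actual formulae, and the experimental data are supplied by the caller rather than fabricated here:

```python
# Hedged sketch of the fitting step only (the paper's actual functional
# forms are not reproduced here): fit coefficients of a Geiger-Nuttall-like
# relation log10(T1/2) = a*Z/sqrt(Q) + b*l*(l+1) + c by least squares to
# experimental proton-emission data supplied by the caller.
import numpy as np

def fit_half_life_formula(Z, Q, l, log_T_exp):
    """Z: charge numbers, Q: decay energies (MeV), l: orbital angular
    momenta, log_T_exp: log10 of measured half-lives. Returns (a, b, c)."""
    Z, Q, l, y = map(np.asarray, (Z, Q, l, log_T_exp))
    A = np.column_stack([Z / np.sqrt(Q), l * (l + 1), np.ones_like(Q)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs  # a, b, c

# Usage: a, b, c = fit_half_life_formula(Z_data, Q_data, l_data, logT_data)
```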

Relevance: 40.00%

Abstract:

Theoretical alpha-decay half-lives of the heaviest nuclei are calculated using the experimental Q values. The barriers along the quasi-molecular shape path are determined within a Generalized Liquid Drop Model (GLDM), and the WKB approximation is used. The results are compared with calculations using the Density-Dependent M3Y (DDM3Y) effective interaction and the Viola-Seaborg-Sobiczewski (VSS) formulae. The calculations provide consistent estimates for the half-lives of the alpha-decay chains of these superheavy elements. In most cases, the experimental data lie between the GLDM and VSS calculations.
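A minimal version of the WKB step reads as follows; the pure Coulomb barrier and the fixed assault frequency are simplifying assumptions standing in for the GLDM potential, and the numbers in the example are hypothetical:

```python
# Minimal WKB sketch (illustrative barrier, not the GLDM potential): the
# penetration probability is P = exp(-2 * integral of sqrt(2*mu*(V(r)-Q))/hbar
# over the classically forbidden region), and T1/2 = ln(2) / (nu * P) for an
# assumed assault frequency nu.
import numpy as np
from scipy.integrate import quad

HBAR_C = 197.327       # MeV fm
E2 = 1.43996           # e^2 in MeV fm

def wkb_half_life(Z_d, Q, mu_c2, r_in, nu=1e21):
    """Z_d: daughter charge, Q: decay energy (MeV), mu_c2: reduced-mass
    energy (MeV), r_in: inner turning point (fm), nu: assault frequency (1/s).
    Uses a pure Coulomb barrier V(r) = 2*Z_d*e^2/r for an alpha particle."""
    V = lambda r: 2.0 * Z_d * E2 / r
    r_out = 2.0 * Z_d * E2 / Q                  # outer turning point: V = Q
    integrand = lambda r: np.sqrt(2.0 * mu_c2 * max(V(r) - Q, 0.0)) / HBAR_C
    action, _ = quad(integrand, r_in, r_out)    # dimensionless WKB action
    P = np.exp(-2.0 * action)
    return np.log(2.0) / (nu * P)

# Example with hypothetical inputs: Z_d ~ 110, Q ~ 9 MeV, r_in ~ 9 fm.
print(f"T1/2 ~ {wkb_half_life(110, 9.0, 3600.0, 9.0):.2e} s")
```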

Relevance: 40.00%

Abstract:

The alpha-decay half-lives of the recently produced isotopes of elements 112, 114, 116 and 118 and of their decay products have been calculated in the quasi-molecular shape path using the experimental Q(alpha) values and a Generalized Liquid Drop Model that includes the proximity effects between nucleons in the neck or gap between the nascent fragments. Reasonable estimates are obtained for the observed alpha-decay half-lives. The results are compared with calculations using the Density-Dependent M3Y effective interaction and the Viola-Seaborg-Sobiczewski formulae. Generalized Liquid Drop Model predictions are also provided for the alpha-decay half-lives of other superheavy nuclei, using the Finite Range Droplet Model Q(alpha) values, and compared with the values derived from the VSS formulae.
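For comparison, a VSS-type estimate is essentially a one-line formula; the sketch below uses the commonly quoted Sobiczewski parameter set and omits the hindrance terms for odd-Z/odd-N nuclei, and the Q(alpha) in the example is hypothetical:

```python
# Viola-Seaborg-type estimate (sketch): log10(T1/2/s) = (a*Z + b)/sqrt(Q)
# + c*Z + d, with the commonly quoted Sobiczewski parameter set; hindrance
# terms for odd-Z/odd-N nuclei are omitted for brevity.
import math

def vss_log_half_life(Z, Q_alpha, a=1.66175, b=-8.5166,
                      c=-0.20228, d=-33.9069):
    """Z: parent charge number, Q_alpha in MeV. Returns log10(T1/2 / s)."""
    return (a * Z + b) / math.sqrt(Q_alpha) + c * Z + d

# Example: element 116 with a hypothetical Q_alpha of 10.7 MeV.
print(f"log10(T1/2/s) ~ {vss_log_half_life(116, 10.7):.1f}")
```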

Relevance: 40.00%

Abstract:

The lifetimes of the alpha decays of the recently produced isotopes of elements 112, 114 and 116, of the element (294)118, and of some of their decay products have been calculated theoretically within the Wentzel-Kramers-Brillouin approximation. The alpha-decay barriers have been determined in the quasi-molecular shape path within a generalized liquid drop model that includes the proximity effects between nuclei in a neck, the mass and charge asymmetry, and the precise nuclear radius. These calculations provide reasonable estimates for the observed alpha-decay lifetimes. The calculated results have been compared with the results of the density-dependent M3Y effective interaction and with the experimental data. This indicates that the theoretical foundation of the generalized liquid drop model is as good as that of the microscopic DDM3Y model, at least in the sense of predicting T-1/2 values, as long as a correct alpha-decay energy is used. The half-lives of these new nuclei are well supported by the consistency of the macroscopic and microscopic calculations with the experimental data.

Relevance: 40.00%

Abstract:

The alpha-decay half-lives of recently synthesized superheavy nuclei (SHN) are investigated by employing a unified fission model (UFM), in which a new method is used to calculate the assault frequency of alpha emission. The excellent agreement with the experimental data indicates that the UFM is a useful tool for investigating these alpha decays. It is found that the alpha-decay half-lives become, on the whole, more and more insensitive to the Q(alpha) values as the atomic number increases, which is favourable for predicting the half-lives of SHN. In addition, a formula is proposed to compute the Q(alpha) values for nuclei with Z >= 92 and N >= 140 with good accuracy, according to which long-lived SHN should be neutron-rich. Several weeks ago, two isotopes of a new element with atomic number Z = 117 were synthesized and their alpha-decay chains were observed. The Q(alpha) formula is found to work well for these nuclei, confirming its predictive power. The experimental half-lives are well reproduced by the UFM with the experimental Q(alpha) values. The fact that the experimental half-lives are compatible with the experimental Q(alpha) values supports, to a certain extent, the synthesis of the new element 117 and the experimental measurements.