173 results for Basic Hypergeometric Functions


Relevance:

20.00%

Publisher:

Abstract:

Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further biological research. These biological sequences, which may be considered multiscale sequences, have specific features that require more refined methods to characterise. This project studies some of these biological challenges using multiscale analysis methods and stochastic modelling approaches.

The first part of the thesis clusters unknown proteins and classifies their families and structural classes. A development in proteomic analysis concerns the determination of protein functions, and the first step in this development is to classify proteins and predict their families. This motivates us to study unknown proteins from specific families, and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins.

The second part of the thesis explores the relationships of proteins through a layered comparison of their components. Many methods are based on protein homology, because resemblance at the sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We therefore examine protein sequences at a detailed level to investigate the problem of comparing proteins. The comparison is based on the empirical mode decomposition (EMD), and protein sequences are characterised by their intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships between proteins.

The third part of the thesis investigates the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As genome-wide gene expression has become a focus of genomic analysis, researchers have long tried to understand the mechanisms of the yeast genome, yet how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and an estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae; this network is useful for uncovering further mechanisms of the yeast cell cycle.
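The abstract does not reproduce the thesis's new cross-correlation formula, so as a rough baseline only, the following sketch computes a standard normalized cross-correlation between two signals (such as intrinsic mode functions from an EMD); all names here are illustrative, not taken from the thesis.

```python
import numpy as np

def normalized_cross_correlation(x, y):
    """Peak normalized cross-correlation between two equal-length signals.

    This is the textbook zero-mean, unit-variance form, shown only as a
    baseline; the thesis introduces its own variant of this idea.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = (x - x.mean()) / (x.std() * len(x))   # normalise so a perfect match scores 1
    y = (y - y.mean()) / y.std()
    return np.correlate(x, y, mode="full").max()

# A signal correlates perfectly with itself; a shifted copy still scores high.
t = np.linspace(0, 4 * np.pi, 200)
s = np.sin(t)
print(round(normalized_cross_correlation(s, s), 3))  # -> 1.0
```

Under this kind of measure, two proteins whose intrinsic mode functions correlate strongly at some lag would be flagged as functionally related.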


Increasingly, almost everything we do in our daily lives is being influenced by information and communications technologies (ICTs), including the Internet. The task of governance is no exception, with an increasing number of national, state, and local governments utilizing ICTs to support government operations, engage citizens, and provide government services. As with other things, the process of governance is now being prefixed with an "e". E-governance can range from simple Web sites that convey basic information to complex sites that transform the customary ways of delivering all sorts of government services. In this respect, local e-government is the form of e-governance that specifically focuses on the online delivery of suitable local services by local authorities. In practice, local e-government reflects four dimensions, each one dealing with the functions of government itself. The four are: (a) e-services, the electronic delivery of government information, programs, and services, often over the Internet; (b) e-management, the use of information technology to improve the management of government, which might range from streamlining business processes to improving the flow of information within government departments; (c) e-democracy, the use of electronic communication vehicles, such as e-mail and the Internet, to increase citizen participation in the public decision-making process; and (d) e-commerce, the exchange of money for goods and services over the Internet, which might include citizens paying taxes and utility bills, renewing vehicle registrations, and paying for recreation programs, or government buying office supplies and auctioning surplus equipment (Cook, LaVigne, Pagano, Dawes, & Pardo, 2002). Commensurate with the rapid increase in the development of e-governance tools, there has been an increased interest in benchmarking the process of local e-governance. This benchmarking, which covers the processes involved in e-governance as well as the extent of e-governance adoption or take-up, is important as it allows for improved processes and enables government agencies to move towards world best practice. It is within this context that this article discusses benchmarking local e-government. It brings together a number of discussions regarding the significance of benchmarking, best practices and actions for local e-government, and key elements of a successful local e-government project.


Many luxury heritage brands operate on the misconception that heritage is interchangeable with history, rather than representative of the emotional response they originally developed in their customers. This idea of heritage as static history inhibits innovation, prevents dynamic renewal and impedes their ability to redefine, strengthen and position their brand in current and emerging marketplaces. This paper examines a number of heritage luxury brands that have successfully identified the original emotional responses they developed in their customers and, through innovative approaches in design, marketing, branding and distribution, evoke these responses in contemporary consumers. Using heritage and innovation hand in hand, these brands have continued to grow and develop a vision of heritage that incorporates both historical and contemporary ideas to meet emerging customer needs. While what constitutes a ‘luxury’ item is constantly challenged in this era of accessible luxury products, up-scaling and aspirational spending, this paper sees consumers’ emotional needs as the key element in defining the concept of luxury. These emotional qualities remain consistently relevant due to their ability to enhance a positive sense of identity for the brand user. Luxury is about the ‘experience’, not just the product, providing the consumer with a sense of enhanced status or identity through invoked feelings of exclusivity, authenticity, quality, uniqueness and culture. This paper analyses luxury heritage brands that have successfully combined these emotional values with those of their ‘heritage’ to create an aura of authenticity and nostalgia that appeals to contemporary consumers. Like luxury, the line where clothing becomes fashion is blurred in the contemporary fashion industry; however, consumer emotion again plays an important role. For example, clothing becomes ‘fashion’ for consumers when it affects their self-perception rather than merely fulfilling the basic functions of shelter and protection. Successful luxury heritage brands can enhance consumers’ sense of self by involving them in the ‘experience’ and ‘personality’ of the brand, so that they see it as a reflection of their own exclusiveness, authentic uniqueness, belonging and cultural value. Innovation is a valuable tool for heritage luxury brands to generate these desired emotional responses and meet the evolving needs of contemporary consumers. While fashion has traditionally been a monologue from brand to consumer, new technology has given consumers a voice to engage brands in a conversation and express their evolving needs, ideas and feedback. As a result, in this consumer-empowered era of information sharing, this paper defines innovation as the ability of heritage luxury brands to develop new design and branding strategies in response to this consumer feedback while retaining the emotional core values of their heritage. This paper analyses how luxury heritage brands can effectively position themselves in the contemporary marketplace by separating heritage from history and incorporating innovative strategies that will appeal to consumer needs of today and tomorrow.


The ghrelin axis consists of the products of the ghrelin gene (GHRL) and their receptors, including the classical ghrelin receptor GHSR. While it is well known that the ghrelin gene encodes the 28 amino acid ghrelin peptide hormone, it is now also clear that the locus encodes a range of other bioactive molecules, including novel peptides and non-coding RNAs. For many of these molecules, the physiological functions and cognate receptor(s) remain to be determined. Emerging research techniques, including proteogenomics, are likely to reveal further ghrelin axis-derived molecules. Studies of the role of ghrelin axis genes, peptides and receptors therefore promise to be a fruitful area of basic and clinical research in years to come.


Optimal design for generalized linear models has primarily focused on univariate data. Often, experiments are performed that have multiple dependent responses described by regression-type models, and it is of interest and value to design the experiment for all of these responses. This requires a multivariate distribution underlying a pre-chosen model for the data. Here, we consider the design of experiments for bivariate binary data which are dependent. We explore copula functions, which provide a rich and flexible class of structures for deriving joint distributions for bivariate binary data. We present methods for deriving optimal experimental designs for dependent bivariate binary data using copulas, and demonstrate that, by including the dependence between responses in the design process, more efficient parameter estimates are obtained than by the usual practice of designing for a single variable only. Further, we investigate the robustness of the designs with respect to initial parameter estimates and the choice of copula function, and show the performance of compound criteria within this bivariate binary setting.
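The abstract does not name a specific copula, so as one concrete illustration, the sketch below uses the Frank copula to build the joint cell probabilities of two dependent Bernoulli responses from their margins; the choice of Frank and all parameter values are assumptions for demonstration only.

```python
import math

def frank_copula(u, v, theta):
    """Frank copula C(u, v; theta), defined for theta != 0."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta

def bivariate_binary_probs(p1, p2, theta):
    """Joint cell probabilities for dependent Bernoulli margins p1, p2.

    P(Y1=1, Y2=1) is set to C(p1, p2); the other three cells then follow
    from the margins by subtraction.
    """
    p11 = frank_copula(p1, p2, theta)
    p10 = p1 - p11
    p01 = p2 - p11
    p00 = 1.0 - p1 - p2 + p11
    return p11, p10, p01, p00

# theta > 0 induces positive dependence: P(1,1) exceeds the independent p1*p2.
probs = bivariate_binary_probs(0.6, 0.3, theta=4.0)
print([round(p, 4) for p in probs])
```

A design criterion (e.g. D-optimality) would then be evaluated against the likelihood built from these four cell probabilities rather than from two independent binomial likelihoods.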


The assumption that mesenchymal stromal cell (MSC)-based therapies are capable of augmenting physiological regeneration processes has fostered intensive basic and clinical research activities. However, to achieve sustained therapeutic success in vivo, not only the biological, but also the mechanical microenvironment of MSCs during these regeneration processes needs to be taken into account. This is especially important for e.g., bone fracture repair, since MSCs present at the fracture site undergo significant biomechanical stimulation. This study has therefore investigated cellular characteristics and the functional behaviour of MSCs in response to mechanical loading. Our results demonstrated a reduced expression of MSC surface markers CD73 (ecto-5’-nucleotidase) and CD29 (integrin β1) after loading. On the functional level, loading led to a reduced migration of MSCs. Both effects persisted for a week after the removal of the loading stimulus. Specific inhibition of CD73/CD29 demonstrated their substrate dependent involvement in MSC migration after loading. These results were supported by scanning electron microscopy images and phalloidin staining of actin filaments displaying less cell spreading, lamellipodia formation and actin accumulations. Moreover, focal adhesion kinase and Src-family kinases were identified as candidate downstream targets of CD73/CD29 that might contribute to the mechanically induced decrease in MSC migration. These results suggest that MSC migration is controlled by CD73/CD29, which in turn are regulated by mechanical stimulation of cells. We therefore speculate that MSCs migrate into the fracture site, become mechanically entrapped, and thereby accumulate to fulfil their regenerative functions.



Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions used to discriminate between them, selecting the optimal forecasting model is clearly challenging. The aim of this thesis is to thoroughly investigate how effective commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. In addition, an empirical study investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of these effective loss functions shows that they can all identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies. QLIKE is identified as the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions’ ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
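The two statistical loss functions named above have standard multivariate forms; the sketch below implements those standard forms (the thesis may normalise them differently) and checks that QLIKE prefers a forecast equal to the covariance proxy over a mis-specified one. The matrices are made-up illustrations.

```python
import numpy as np

def mse_loss(H, S):
    """Element-wise MSE between forecast covariance H and proxy S."""
    return np.mean((H - S) ** 2)

def qlike_loss(H, S):
    """Multivariate QLIKE: log|H| + tr(H^{-1} S).

    Minimised in expectation at H = E[S], which is what makes it a
    'robust' loss for comparing volatility forecasts against a proxy.
    """
    sign, logdet = np.linalg.slogdet(H)
    return logdet + np.trace(np.linalg.solve(H, S))

S = np.array([[1.0, 0.3], [0.3, 2.0]])        # realised covariance proxy
good = S.copy()                                # forecast matching the proxy
bad = np.array([[2.0, 0.0], [0.0, 1.0]])       # mis-specified forecast
print(qlike_loss(good, S) < qlike_loss(bad, S))  # QLIKE ranks the good forecast first
```

A ranking exercise of the kind described in the thesis repeats this comparison over many periods and feeds the loss differentials into a test such as the MCS.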


Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, few attempts have been made to explore structural damage using frequency response functions (FRFs). This paper illustrates the damage identification and condition assessment of a beam structure using a new FRF-based damage index and Artificial Neural Networks (ANNs). In practice, using all available FRF data as input to artificial neural networks makes training and convergence impossible. Therefore, a data reduction technique, Principal Component Analysis (PCA), is introduced into the algorithm. In the proposed procedure, a large set of FRFs is divided into sub-sets in order to find the damage indices for different frequency points of different damage scenarios. The basic idea of this method is to establish features of the damaged structure using FRFs from different measurement points of different sub-sets of the intact structure. Then, using these features, damage indices for different damage cases of the structure are identified after reconstructing the available FRF data using PCA. The obtained damage indices, corresponding to different damage locations and severities, are introduced as input variables to the developed artificial neural networks. Finally, the effectiveness of the proposed method is illustrated and validated using the finite element model of a beam structure. The results show that the PCA-based damage index is suitable and effective for structural damage detection and condition assessment of building structures.
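The data-reduction step described above can be sketched generically: project high-dimensional FRF measurement vectors onto a few principal components before handing them to a neural network. This is a plain SVD-based PCA with made-up array sizes; the paper's sub-set partitioning and damage-index formula are not reproduced here.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project the rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)                        # centre each frequency point
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]                 # principal directions
    scores = Xc @ components.T                     # reduced representation
    return scores, components

rng = np.random.default_rng(0)
frf = rng.normal(size=(20, 500))   # e.g. 20 measurements x 500 frequency points
scores, comps = pca_reduce(frf, n_components=5)
print(scores.shape)                # each measurement is now a 5-vector
```

The 5-dimensional scores (rather than the raw 500-point FRFs) would then serve as the ANN input, which is what makes training tractable.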


The health system is one sector dealing with a deluge of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, and many still run stand-alone systems that are not integrated for information management and decision-making. This shows there is a need for an effective system to capture, collate and distribute health data; implementing the data warehouse concept in healthcare is potentially one solution for integrating it. Data warehousing has been used to support business intelligence and decision-making in many other sectors, such as engineering, defence and retail. The research problem addressed is, "how can data warehousing assist the decision-making process in healthcare?" To address this problem, the investigation was narrowed to a cardiac surgery unit, using the unit at The Prince Charles Hospital (TPCH) as the case study. The cardiac surgery unit at TPCH uses a stand-alone database of patient clinical data, which supports clinical audit, service management and research functions. However, much of the time, the interaction between the cardiac surgery unit's information system and other units is minimal, with only limited, basic two-way interaction with the other clinical and administrative databases at TPCH that support decision-making processes. The aims of this research are to investigate what decision-making issues healthcare professionals face with the current information systems, and how decision-making might be improved within this healthcare setting by implementing an aligned data warehouse model or models.

As part of the research, a suitable data warehouse prototype was proposed and developed based on the cardiac surgery unit's needs, integrating the Intensive Care Unit database, the Clinical Costing unit database (Transition II) and the Quality and Safety unit database [electronic discharge summary (e-DS)], with the goal of improving the current decision-making processes. The main objective is to improve access to integrated clinical and financial data, providing potentially better information for decision-making. Drawing on the questionnaire responses and the literature, the results indicate a centralised data warehouse model for the cardiac surgery unit at this stage. A centralised model addresses current needs and can later be upgraded to an enterprise-wide or federated data warehouse model, as discussed in many of the consulted publications. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS enterprise edition 4.3. In the final stage, the prototype was evaluated by collecting feedback from the end users, using output created from the prototype as examples of the data desired and possible in a data warehouse environment. According to this feedback, a data warehouse was seen as a useful tool to inform management options, provide a more complete representation of the factors related to a decision scenario, and potentially reduce information product development time.

However, many constraints exist in this research: technical issues such as data incompatibilities and the integration of the cardiac surgery database and e-DS database servers; Queensland Health information restrictions (information-related policies, patient data confidentiality and ethics requirements); limited availability of support from IT technical staff; and time restrictions. These factors influenced the warehouse model development, necessitating an incremental approach, and highlight the many practical barriers to data warehousing and integration at the clinical service level. Limitations included the use of a small convenience sample of survey respondents and a single-site case report study design. As mentioned previously, the proposed data warehouse is a prototype and was developed using only four database repositories. Despite these constraints, the research demonstrates that implementing a data warehouse at the service level supports decision-making and reduces data quality issues related to access and availability, providing many benefits. Output reports produced from the prototype demonstrated usefulness for improving decision-making in the management of clinical services, and for quality and safety monitoring for better clinical care. In the future, the centralised model can be upgraded to an enterprise-wide architecture by integrating additional hospital units' databases.
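The integration idea at the core of the prototype can be sketched in miniature: load rows from separate unit databases into one integrated fact table that queries can then span. The table and column names below are hypothetical, not taken from the TPCH systems, and an in-memory SQLite database stands in for the SAS tooling actually used.

```python
import sqlite3

# Toy warehouse: two "source" tables (ICU stays, clinical costing) are
# joined on a shared patient key into a single integrated fact table.
wh = sqlite3.connect(":memory:")
wh.executescript("""
    CREATE TABLE icu_stay (patient_id INTEGER, icu_hours REAL);
    CREATE TABLE costing  (patient_id INTEGER, total_cost REAL);
    CREATE TABLE fact_episode (patient_id INTEGER, icu_hours REAL, total_cost REAL);
""")
wh.executemany("INSERT INTO icu_stay VALUES (?, ?)", [(1, 36.0), (2, 12.5)])
wh.executemany("INSERT INTO costing VALUES (?, ?)", [(1, 8200.0), (2, 4100.0)])
wh.execute("""
    INSERT INTO fact_episode
    SELECT i.patient_id, i.icu_hours, c.total_cost
    FROM icu_stay i JOIN costing c USING (patient_id)
""")
rows = wh.execute("SELECT * FROM fact_episode ORDER BY patient_id").fetchall()
print(rows)  # each row now combines clinical and financial data
```

Reports over `fact_episode` can answer questions that neither stand-alone source table could, which is the decision-support benefit the thesis describes.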


In this study we set out to dissociate the developmental time courses of automatic symbolic number processing and cognitive control functions in grade 1-3 British primary school children. Event-related potential (ERP) and behavioral data were collected in a physical size discrimination numerical Stroop task. Task-irrelevant numerical information was already processed automatically in grade 1. Weakening interference and strengthening facilitation indicated the parallel development of general cognitive control and automatic number processing. Relationships among the ERP and behavioral effects suggest that control functions play a larger role in younger children and that the automaticity of number processing increases from grade 1 to 3.
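The interference and facilitation effects mentioned above are conventionally defined relative to a neutral condition; the sketch below shows that standard computation with made-up reaction times, not data from the study.

```python
def stroop_effects(rt_congruent, rt_neutral, rt_incongruent):
    """Return (interference, facilitation) for mean reaction times.

    Interference: slowdown on incongruent relative to neutral trials.
    Facilitation: speedup on congruent relative to neutral trials.
    """
    interference = rt_incongruent - rt_neutral
    facilitation = rt_neutral - rt_congruent
    return interference, facilitation

# Illustrative RTs in milliseconds (hypothetical values).
print(stroop_effects(540.0, 570.0, 640.0))  # (70.0, 30.0)
```

In the developmental pattern reported, the first number would shrink and the second grow between grade 1 and grade 3.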