945 results for Advanced Transaction Models


Relevance: 30.00%

Publisher:

Abstract:

Objective: The aim was to investigate whether there was an association between periodontitis or tooth loss in a homogeneous group of 60-70-year-old Western European men and either a sustained high or low level of C-reactive protein (CRP).
Material and Methods: Men enrolled in a cohort study of cardiovascular disease in Northern Ireland were screened in 1990-1994 and rescreened in 2001-2004, when a periodontal examination was completed. High-sensitivity CRP was measured from fasting blood samples. There were 806 men with six or more teeth who had either a high level (>3 mg/l) or a lower level of CRP at both time points. Multivariate analysis was carried out using logistic regression with adjustment for possible confounders. Models were constructed with the CRP level as the outcome variable and various measures of periodontal status (low and high threshold periodontitis) or tooth loss as predictor variables. Confounders included in the analysis were known cardiovascular risk factors of age, smoking, diabetes, BMI and socioeconomic status.
Results: There were 67 men who had a high value of CRP (>3 mg/l) and 739 men who had a CRP value ≤3 mg/l at both time points. The unadjusted odds ratio (OR) for advanced periodontitis to be associated with high CRP was 3.62, p=0.0003. The association was somewhat attenuated but remained significant (OR=2.49, p=0.02) after adjustment for confounders. A high level of tooth loss was also associated with high CRP, with an adjusted OR of 2.17, p=0.008. Low threshold periodontitis was not associated with the level of CRP.
Conclusion: There was an association between advanced periodontitis and elevated CRP levels as measured at two time points at a 10-year interval in the 60-70-year-old European males investigated. This association was adjusted for various cardiovascular risk factors. There was also an association between high levels of tooth loss and high CRP in the men studied.
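For readers unfamiliar with the statistic reported above, an unadjusted odds ratio comes straight from a 2x2 table of exposure against outcome; a minimal sketch with illustrative cell counts (not the study's actual data):

```python
import math

# Hypothetical 2x2 table: rows = advanced periodontitis (yes/no),
# columns = sustained high CRP (>3 mg/l) vs low CRP (<=3 mg/l).
# These counts are illustrative, not taken from the paper.
a, b = 20, 47     # periodontitis: high CRP, low CRP
c, d = 47, 692    # no periodontitis: high CRP, low CRP

# Unadjusted odds ratio, with a 95% confidence interval on the log scale.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

The adjusted ORs in the abstract additionally condition on the listed confounders via logistic regression, which this toy calculation does not do.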

Relevance: 30.00%

Publisher:

Abstract:

The development of wideband network services and the new network infrastructures to support them has placed many more requirements on current network management systems. Issues such as scalability, integrity and interoperability have become more important, and existing management systems are not flexible enough to support the provision of Quality of Service (QoS) in these dynamic environments. The concept of Programmable Networks has been proposed to address these requirements. Within this framework, CORBA is regarded as a middleware technology that can enable interoperation among the distributed entities found in Programmable Networks. By using the basic CORBA environment in a heterogeneous network environment, a network manager is able to control remote Network Elements (NEs) in the same way it controls its local resources. Using this approach, both the flexibility and the intelligence of overall network management can be improved. This paper proposes the use of two advanced features of CORBA to enhance QoS management in a Programmable Network environment: the Transaction Service can be used to manage a set of tasks whenever the management of elements in a network is correlated, and the Concurrency Service can be used to coordinate multiple accesses to the same network resources. It is also shown that proper use of CORBA can greatly reduce the development and administration effort of network management applications.
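The two CORBA services are named above but not shown; the following Python sketch illustrates the underlying pattern — all-or-nothing updates across correlated network elements, with per-element locking — rather than the CORBA Transaction or Concurrency Service APIs themselves. All class and method names here are hypothetical.

```python
import threading

class NetworkElement:
    """Hypothetical NE whose configuration we manage atomically."""
    def __init__(self, name):
        self.name = name
        self.bandwidth = 0
        self.lock = threading.Lock()  # stands in for the Concurrency Service

def reserve_path(elements, bandwidth):
    """All-or-nothing reservation across correlated NEs, mimicking the
    atomicity a Transaction Service would provide."""
    acquired, done = [], []
    try:
        for ne in elements:           # coordinate concurrent access
            ne.lock.acquire()
            acquired.append(ne)
        for ne in elements:           # provisional updates
            ne.bandwidth += bandwidth
            done.append(ne)
        return True                   # commit
    except Exception:
        for ne in done:               # roll back on any failure
            ne.bandwidth -= bandwidth
        return False
    finally:
        for ne in acquired:
            ne.lock.release()

nes = [NetworkElement(f"NE{i}") for i in range(3)]
ok = reserve_path(nes, 10)
```

In a real CORBA deployment the coordination and rollback are delegated to the middleware instead of being hand-written as above.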

Relevance: 30.00%

Publisher:

Abstract:

Diabetes Mellitus (DM) has been found to have subtle yet profound effects on the metabolic status of the testis and the expression of numerous spermatogenic genes, and is associated with increased numbers of sperm with nuclear DNA damage. The precise mechanism causing these detrimental effects remains unknown. The presence of increased levels of the most prominent advanced glycation end product adduct (carboxymethyllysine, CML) and its receptor (RAGE) in the reproductive tract of DM men has provided a new avenue for research. As there are suspicions that the antibiotic streptozotocin (STZ), employed to induce DM, is itself capable of causing oxidative stress and DNA damage, we compared CML and RAGE levels in the reproductive tract and sperm nDNA status of STZ mice with the levels in the Ins(2Akita) mouse to determine which more closely mimics the situation described in the human diabetic. CML was observed in the testes, epididymides and sperm of all animals. Sperm from DM mice showed particularly strong CML immunolocalization in the acrosomal cap, the equatorial region and, whenever present, cytoplasmic droplets. Although increased, the level of CML on the sperm of the STZ and Ins(2Akita) DM mice did not reach statistical significance. RAGE was present on the developing acrosome and epididymal sperm of all animals and in discrete regions of the epididymides of the DM models. Only the epididymal sperm of the Ins(2Akita) mice were found to have significantly increased (p < 0.0001) nDNA damage. The Ins(2Akita) mouse therefore appears to more accurately reflect the conditions found in the human and, as such, is a more representative model for the study of diabetes and glycation's influence on male fertility.

Relevance: 30.00%

Publisher:

Abstract:

Customs are generally perceived as a time-consuming impediment to international trade. However, few studies have empirically examined the determinants and the impact of this type of government-imposed transaction cost. This paper analyses the role of firm size as a determinant of customs-related transaction costs, as well as the effect of firm size on the relationship between these costs and the international trade intensity of firms. The results indicate that customs-related transaction costs repress the international trade activities of firms, even at low levels. The paper identifies transaction-related economies of scale, simplified customs procedures and advanced information and communication technology as the main determinants of customs-related transaction costs. It is shown that when these factors are taken into account, firm size has no effect on customs-related transaction costs. Policy implications are considered for firm strategy and public policy.

Relevance: 30.00%

Publisher:

Abstract:

The human body is an extremely challenging environment for the operation of wireless communications systems, not least because of the complex antenna-body electromagnetic interaction effects which can occur. This is further compounded by the impact of movement and the propagation characteristics of the local environment, which all have an effect upon body centric communications channels. As the successful design of body area networks (BANs) and other types of body centric system is inextricably linked to a thorough understanding of these factors, the aim of this paper is to conduct a survey of the current state of the art in relation to propagation and channel models, primarily for BANs but also considering other types of body centric communications. We initially discuss some of the standardization efforts performed by the Institute of Electrical and Electronics Engineers 802.15.6 task group before focusing on the two most popular types of technologies currently being considered for BANs, namely narrowband and Ultrawideband (UWB) communications. For narrowband communications the applicability of a generic path loss model is contended, before presenting some of the scenario-specific models which have proven successful. The impacts of human body shadowing and small-scale fading are also presented, alongside some of the most recent research into the Doppler and time dependencies of BANs. For UWB BAN communications, we again consider the path loss as well as empirical tap delay line models developed from a number of extensive channel measurement campaigns conducted by research institutions around the world. Ongoing efforts within collaborative projects such as COST (European Cooperation in Science and Technology) Action IC1004 are also described. Finally, recent years have also seen significant developments in other areas of body centric communications such as off-body and body-to-body communications.
We highlight some of the newest relevant research in these areas and discuss some of the advanced topics currently being addressed in the field of body centric communications.
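The "generic path loss model" debated for narrowband BAN channels is usually the log-distance model with log-normal shadowing; a minimal sketch, where the reference distance, exponent and shadowing spread are illustrative values, not figures from any of the surveyed measurement campaigns:

```python
import math

def path_loss_db(d, d0=0.1, pl_d0=35.0, n=3.5):
    """Mean log-distance path loss in dB:
    PL(d) = PL(d0) + 10*n*log10(d/d0).
    A zero-mean Gaussian shadowing term (in dB) would be added on top
    to model log-normal shadowing around this mean."""
    return pl_d0 + 10.0 * n * math.log10(d / d0)

# Mean path loss at a few on-body distances in metres (illustrative).
losses = {d: path_loss_db(d) for d in (0.1, 0.2, 0.4)}
print(losses)
```

The survey's point is that a single (d0, n) pair rarely fits all on-body scenarios, which is why scenario-specific parameterizations are preferred.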

Relevance: 30.00%

Publisher:

Abstract:

Background: Around 10-15% of patients with locally advanced rectal cancer (LARC) achieve a pathological complete response (TRG4) to neoadjuvant chemoradiotherapy; the remaining patients exhibit a spectrum of tumour regression (TRG1-3). Understanding therapy-related genomic alterations may help us to identify underlying biology or novel targets associated with response that could increase the efficacy of therapy in patients who do not benefit from the current standard of care.
Methods: 48 FFPE rectal cancer biopsies and matched resections were analysed using the WG-DASL HumanHT-12_v4 BeadChip array on the Illumina iScan. Bioinformatic analysis was conducted in Partek Genomics Suite and RStudio. The limma and glmnet packages were used to identify genes differentially expressed between tumour regression grades. Validation of the microarray results will be carried out using IHC, RNAscope and RT-PCR.
Results: Supervised analysis of the biopsies identified immune-response genes that may have predictive value. Differential gene expression from the resections, as well as pre- and post-therapy analysis, revealed induction of genes in a tumour-regression-dependent manner. Pathway mapping and Gene Ontology analysis of these genes suggested antigen processing and natural-killer-mediated cytotoxicity, respectively. The natural-killer-like gene signature was switched off in non-responders and on in responders. IHC confirmed the presence of natural killer cells through CD56+ staining.
Conclusion: Identification of NK cell genes and CD56+ cells in patients responding to neoadjuvant chemoradiotherapy warrants further investigation into their association with tumour regression grade in LARC. NK cells are known to lyse malignant cells and determining whether their presence is a cause or consequence of response is crucial. Interrogation of the cytokines upregulated in our NK-like signature will help guide future in vitro models.
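As a toy illustration of the differential-expression step described above (the real analysis used limma and glmnet in R), a per-gene Welch t statistic between responder and non-responder groups can be computed; the log-expression values below are synthetic, not the study's data:

```python
import math
import random
from statistics import mean, stdev

random.seed(0)

def welch_t(x, y):
    """Welch's t statistic for two independent samples with
    unequal variances."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

# Simulated log2 expression for one NK-signature-like gene
# in ten responders (TRG3-4) and ten non-responders (TRG1-2).
responders = [random.gauss(8.0, 0.5) for _ in range(10)]
non_responders = [random.gauss(6.0, 0.5) for _ in range(10)]

t_stat = welch_t(responders, non_responders)
```

In the actual pipeline this per-gene testing is done genome-wide with moderated statistics and multiple-testing correction, which a single t test does not capture.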

Relevance: 30.00%

Publisher:

Abstract:

Over the years, the increasing search for and exchange of information has led to growing traffic intensity in today's optical communication networks. Coherent communications, using both the amplitude and the phase of the signal, has re-emerged as one of the transmission techniques for increasing the spectral efficiency and throughput of optical channels. In this context, this work presents a study on format conversion of modulated signals using MZI-SOAs, based exclusively on all-optical techniques through wavelength conversion. This approach, when applied in interconnection nodes between optical networks with different bit rates and modulation formats, allows better efficiency and scalability of the network. We start with an experimental characterization of the static and dynamic properties of the MZI-SOA. Then, we propose a semi-analytical model to describe the evolution of phase and amplitude at the output of the MZI-SOA. The model's coefficients are obtained using a multi-objective genetic algorithm. We validate the model experimentally by exploring the dependency of the optical signal on the operational parameters of the MZI-SOA. We also propose an all-optical technique for the conversion of amplitude-modulated signals to a continuous phase modulation format. Finally, we study the potential of MZI-SOAs for the conversion of amplitude signals to QPSK and QAM signals. We show the dependency of the conversion process on deviations of the operational parameters from their optimal values. The technique is experimentally validated for QPSK modulation.
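The coefficient-fitting step can be illustrated with a miniature single-objective genetic algorithm; the thesis uses a multi-objective GA on measured MZI-SOA amplitude/phase data, whereas the quadratic target curve and GA settings here are purely illustrative:

```python
import random

random.seed(1)

# Illustrative "measured" transfer curve y = a*x + b*x^2 with a=2, b=-0.5.
xs = [i / 10 for i in range(11)]
ys = [2.0 * x - 0.5 * x * x for x in xs]

def fitness(coef):
    """Negative sum of squared errors: higher is better."""
    a, b = coef
    return -sum((y - (a * x + b * x * x)) ** 2 for x, y in zip(xs, ys))

# Elitist GA: keep the best 10, refill with Gaussian-mutated parents.
pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [(a + random.gauss(0, 0.1), b + random.gauss(0, 0.1))
                     for a, b in random.choices(parents, k=30)]

best = max(pop, key=fitness)
```

A multi-objective GA differs in that it ranks candidates by Pareto dominance over several error terms (e.g. amplitude and phase fit) instead of a single scalar fitness.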

Relevance: 30.00%

Publisher:

Abstract:

Local-level planning requires statistics for small areas but, due to cost or logistic constraints, sample surveys are often planned to provide reliable estimates only for large geographical regions and large subgroups of a population.

Relevance: 30.00%

Publisher:

Abstract:

This paper explores some questions about the use of models of nursing. These questions make various assumptions about the nature of models of nursing, in general and in particular. Underlying these assumptions are various philosophical positions which are explored through an introduction to postmodernist approaches in philosophical criticism. To illustrate these approaches, a critique of the Roper et al. model is developed, and more general attitudes towards models of nursing are examined. It is suggested that postmodernism offers a challenge to many of the assumptions implicit in models of nursing, and that a greater awareness of these assumptions should lead to nursing care being better informed where such models are in use.

Relevance: 30.00%

Publisher:

Abstract:

Neurological disorders are a major concern in modern societies, with increasing prevalence mainly related to higher life expectancy. Most currently available therapeutic options can only control and ameliorate patients' symptoms, often becoming refractory over time. Therapeutic breakthroughs and advances have been hampered by the lack of accurate central nervous system (CNS) models. The development of such models allows the study of disease onset/progression mechanisms and the preclinical evaluation of novel therapeutics. This has traditionally relied on genetically engineered animal models, which often diverge considerably from the human phenotype (developmentally, anatomically and physiologically), and on 2D in vitro cell models, which fail to recapitulate the characteristics of the target tissue (cell-cell and cell-matrix interactions, cell polarity). The in vitro recapitulation of CNS phenotypic and functional features requires the implementation of advanced culture strategies that mimic the in vivo structural and molecular complexity. Models based on differentiation of human neural stem cells (hNSC) in 3D cultures have great potential as complementary tools in preclinical research, bridging the gap between human clinical studies and animal models. This thesis aimed at the development of novel human 3D in vitro CNS models by integrating agitation-based culture systems and a wide array of characterization tools. Neural differentiation of hNSC as 3D neurospheres was explored in Chapter 2. Here, it was demonstrated that human midbrain-derived neural progenitor cells of fetal origin (hmNPC) can generate complex tissue-like structures containing functional dopaminergic neurons, as well as astrocytes and oligodendrocytes.
Chapter 3 focused on the development of cellular characterization assays for cell aggregates based on light-sheet fluorescence imaging systems, which resulted in increased spatial resolution for both fixed samples and live imaging. The applicability of the developed human 3D cell model for preclinical research was explored in Chapter 4, evaluating the potential of a viral vector candidate for gene therapy. The efficacy and safety of helper-dependent CAV-2 (hd-CAV-2) for gene delivery in human neurons was evaluated, demonstrating increased neuronal tropism, efficient transgene expression and minimal toxicity. The potential of human 3D in vitro CNS models to mimic brain functions was further addressed in Chapter 5. Exploring the use of 13C-labeled substrates and Nuclear Magnetic Resonance (NMR) spectroscopy tools, neural metabolic signatures were evaluated, showing lineage-specific metabolic specialization and the establishment of neuron-astrocytic shuttles upon differentiation. Chapter 6 focused on transferring the knowledge and strategies described in the previous chapters to the implementation of a scalable and robust process for the 3D differentiation of hNSC derived from human induced pluripotent stem cells (hiPSC). Here, software-controlled perfusion stirred-tank bioreactors were used as the technological system to sustain cell aggregation and differentiation. The work developed in this thesis provides practical and versatile new in vitro approaches to model the human brain. Furthermore, the culture strategies described herein can be extended to other sources of neural phenotypes, including patient-derived hiPSC. The combination of this 3D culture strategy with the implemented characterization methods represents a powerful complementary tool applicable in drug discovery, toxicology and disease modeling.

Relevance: 30.00%

Publisher:

Abstract:

Hardware/software systems are becoming indispensable in all aspects of everyday life. The growing presence of these systems in various products and services creates a need for methods to develop them efficiently. However, efficient design of such systems is limited by several factors, among them: the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modeling (TLM) is considered a promising paradigm for managing design complexity and providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose to use a combination of two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy allows us to combine efficient simulation methods and formal analytical methods in a single environment. We propose a new timing verification algorithm based on a linearization procedure for min/max constraints, together with an optimization technique to improve the algorithm's efficiency. We complete the mathematical description of all the constraint types presented in the literature. We have also developed exploration and refinement methods for the communication system that allow the timing verification algorithms to be used at different TLM levels.
As several definitions of TLM exist, within our research we defined a specification and simulation methodology for hardware/software systems based on the TLM paradigm. In this methodology, several modeling concepts can be considered separately. Building on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming and several others provided by the .Net environment, the proposed methodology presents an approach that makes it possible to reuse intermediate models in order to cope with the time-to-market constraint. It provides a general approach to system modeling that separates different design aspects, such as the models of computation used to describe the system at multiple levels of abstraction. As a result, the system's functionality can be clearly identified in the system model without the details tied to development platforms, which improves the portability of the application model.
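Once linearized, min/max timing constraints reduce to systems of difference constraints of the form t_j - t_i <= c, whose consistency can be checked with a shortest-path (Bellman-Ford) pass over the constraint graph. This is a generic sketch of that standard technique, not the algorithm from the thesis itself:

```python
def consistent(n, constraints):
    """Each constraint t_j - t_i <= c is an edge i -> j with weight c.
    The system is feasible iff the constraint graph has no negative
    cycle; we run Bellman-Ford from an implicit source connected to
    every event with weight 0."""
    dist = [0.0] * n
    for _ in range(n):
        changed = False
        for i, j, c in constraints:
            if dist[i] + c < dist[j]:
                dist[j] = dist[i] + c
                changed = True
        if not changed:
            return True, dist   # converged: feasible schedule in dist
    return False, dist          # still relaxing after n passes: infeasible

# Four transaction events with illustrative separation bounds.
ok, times = consistent(4, [(0, 1, 5), (1, 2, 3), (0, 2, 10)])
# Contradictory pair: t1 - t0 <= 2 and t0 - t1 <= -3 (i.e. t1 - t0 >= 3).
bad, _ = consistent(2, [(0, 1, 2), (1, 0, -3)])
```

When feasible, the `dist` vector itself is one valid assignment of (relative) event times.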

Relevance: 30.00%

Publisher:

Abstract:

My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first article, we develop a computationally efficient state-smoothing procedure for a linear Gaussian state-space model. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyze the computational efficiency of methods based on the Kalman filter, the Cholesky factor algorithm and our new method, using operation counts and computational experiments. We show that for many important cases our method is more efficient. The gains are particularly large when the dimension of the observed variables is large, or when repeated draws of the states are required for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, used to analyze transaction count data in financial markets. In the second chapter, we propose a new technique for analyzing multivariate stochastic volatility models. The proposed method is based on efficiently drawing the volatility from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the return equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, allowing different Student-t marginals with specific degrees of freedom to capture the heterogeneity of the returns.
We draw the volatility as a block in the time dimension and one at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of the volatility of one return given the volatilities of the other returns, the parameters and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and two multivariate models. In the third chapter, we assess the information contributed by realized volatility measures to the estimation and forecasting of volatility when prices are measured with and without error. We use stochastic volatility models. We take the viewpoint of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity that contains information about it. We use Bayesian Markov chain Monte Carlo methods to estimate the models, which allow the computation not only of posterior densities of the volatility, but also of predictive densities of future volatility. We compare volatility forecasts and the hit rates of forecasts that do and do not use the information contained in realized volatility. This approach differs from existing ones in the empirical literature, which are most often limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns on indices and exchange rates. The competing models are applied to the second half of 2008, a defining period of the recent financial crisis.
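The state-smoothing methods compared in the first chapter all target the linear Gaussian state-space model; the recursions involved can be illustrated with a scalar local-level Kalman filter (the parameters below are illustrative, and this is the textbook filter, not the thesis's smoothing algorithm):

```python
# Local level model: x_t = x_{t-1} + w_t,  y_t = x_t + v_t,
# with Var(w) = q and Var(v) = r.
def kalman_filter(ys, q=0.1, r=1.0, x0=0.0, p0=10.0):
    """Returns the filtered state means E[x_t | y_1..y_t]."""
    x, p, means = x0, p0, []
    for y in ys:
        p = p + q                  # predict: prior variance for x_t
        k = p / (p + r)            # Kalman gain
        x = x + k * (y - x)        # update with the new observation
        p = (1 - k) * p            # posterior variance
        means.append(x)
    return means

filtered = kalman_filter([1.0, 1.2, 0.9, 1.1, 1.0])
```

A smoother then runs a backward pass to obtain E[x_t | y_1..y_T]; the thesis's contribution concerns doing the corresponding state *draws* more cheaply than Kalman-filter-based or Cholesky-factor-based samplers.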

Relevance: 30.00%

Publisher:

Abstract:

Pollution of water with pesticides has become a threat to humans, materials and the environment. Pesticides released to the environment reach water bodies through run-off, and industrial wastewater from pesticide-manufacturing industries contains pesticides at higher concentrations, making it a major source of water pollution. Pesticides create many health and environmental hazards, including cancer, liver and kidney disorders, reproductive disorders, fetal death and birth defects. Conventional wastewater treatment plants based on biological treatment are not efficient enough to remove these compounds to the desired level: most pesticides are recalcitrant in nature and toxic to the microorganisms responsible for their degradation. Advanced oxidation processes (AOPs) are a class of oxidation techniques in which hydroxyl radicals are employed for the oxidation of pollutants; AOPs have the ability to totally mineralise organic pollutants to CO2 and water. Different methods are employed for the generation of hydroxyl radicals in AOP systems. Acetamiprid is a neonicotinoid insecticide widely used to control sucking insects on crops such as leafy vegetables, citrus fruits, pome fruits, grapes, cotton and ornamental flowers. It is now recommended as a substitute for organophosphorus pesticides, and since its use is increasing, its presence is increasingly found in the environment. It has high water solubility and is not easily biodegradable, so it has the potential to pollute surface and ground waters. Here, the use of AOPs for the removal of acetamiprid from wastewater has been investigated. Five methods were selected for the study based on a literature survey and preliminary experiments: the Fenton process, UV treatment, the UV/H2O2 process, photo-Fenton and photocatalysis using TiO2. Undoped TiO2 and TiO2 doped with Cu and Fe were prepared by the sol-gel method.
Characterisation of the prepared catalysts was done by X-ray diffraction, scanning electron microscopy, differential thermal analysis and thermogravimetric analysis. The influence of the major operating parameters on the removal of acetamiprid was investigated. All experiments were designed using the central composite design (CCD) of response surface methodology (RSM). Model equations were developed for Fenton, UV/H2O2, photo-Fenton and photocatalysis to predict acetamiprid removal and total organic carbon (TOC) removal under different operating conditions. The quality of the models was analysed by statistical methods, and experimental validations were done to confirm it. Optimum conditions obtained by experiment were verified against those obtained using the response optimiser. The Fenton process is the simplest and oldest AOP, in which hydrogen peroxide and iron are employed for the generation of hydroxyl radicals. The influence of H2O2 and Fe2+ on acetamiprid removal and TOC removal by the Fenton process was investigated, and it was found that removal increases with increasing H2O2 and Fe2+ concentration. At an initial acetamiprid concentration of 50 mg/L, 200 mg/L H2O2 and 20 mg/L Fe2+ at pH 3 were found to be optimum for acetamiprid removal. For UV treatment, the effect of pH was studied and it was found that pH does not have much effect on the removal rate. Addition of H2O2 to the UV process increased the removal rate because of hydroxyl radical formation due to photolysis of H2O2; an H2O2 concentration of 110 mg/L at pH 6 was found to be optimum for acetamiprid removal. With photo-Fenton, a drastic reduction in treatment time was observed, together with a tenfold reduction in the amount of reagents required: an H2O2 concentration of 20 mg/L and an Fe2+ concentration of 2 mg/L were found to be optimum at pH 3. With TiO2 photocatalysis, an improvement in the removal rate was noticed compared to UV treatment.
The effect of Cu and Fe doping on photocatalytic activity under UV light was studied, and it was observed that Cu doping enhanced the removal rate slightly while Fe doping decreased it. Maximum acetamiprid removal was observed at an optimum catalyst loading of 1000 mg/L and a Cu concentration of 1 wt%. It was noticed that the mineralisation efficiency of the processes is low compared to the acetamiprid removal efficiency; this may be due to stable intermediate compounds formed during degradation. Kinetic studies were conducted for all the treatment processes, and it was found that all processes follow pseudo-first-order kinetics. Kinetic constants were determined from the experimental data for all the processes and half-lives were calculated. The rate of reaction was in the order photo-Fenton > UV/H2O2 > Fenton > TiO2 photocatalysis > UV. Operating costs were calculated for the processes, and it was found that photo-Fenton removes acetamiprid at the lowest operating cost in the least time. A kinetic model was developed for the photo-Fenton process using elementary reaction data and mass balance equations for the species involved. The variation of acetamiprid concentration with time for different H2O2 and Fe2+ concentrations at pH 3 can be predicted using this model, which was validated by comparing simulated concentration profiles with those obtained from experiments. This study established the viability of the selected AOPs for the removal of acetamiprid from wastewater; of those studied, photo-Fenton gives the highest removal efficiency at the lowest operating cost within the shortest time.
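Pseudo-first-order kinetics as used above means C(t) = C0 * exp(-k t), so the rate constant k comes from a linear fit of ln(C0/C) versus t and the half-life is t1/2 = ln 2 / k. A sketch with illustrative concentration data, not the measured acetamiprid values:

```python
import math

times = [0, 5, 10, 15, 20]            # minutes (illustrative)
conc = [50.0, 30.3, 18.4, 11.2, 6.8]  # mg/L (illustrative)

# Pseudo-first-order: ln(C0/C) = k*t, so fit the slope through the origin.
ys = [math.log(conc[0] / c) for c in conc]
k = sum(t * y for t, y in zip(times, ys)) / sum(t * t for t in times)
half_life = math.log(2) / k

print(f"k = {k:.3f} 1/min, t1/2 = {half_life:.1f} min")
```

The same fit applied to each process's concentration-time data yields the rate constants behind the photo-Fenton > UV/H2O2 > Fenton > TiO2 photocatalysis > UV ordering reported above.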

Relevance: 30.00%

Publisher:

Abstract:

The basic premise of transaction-cost theory is that the decision to outsource, rather than to undertake work in-house, is determined by the relative costs incurred in each of these forms of economic organization. In construction the "make or buy" decision invariably leads to a contract. Reducing the costs of entering into a contractual relationship (transaction costs) raises the value of production and is therefore desirable. Commonly applied methods of contractor selection may not minimise the costs of contracting. Research evidence suggests that although competitive tendering typically results in the lowest bidder winning the contract this may not represent the lowest project cost after completion. Multi-parameter and quantitative models for contractor selection have been developed to identify the best (or least risky) among bidders. A major area in which research is still needed is in investigating the impact of different methods of contractor selection on the costs of entering into a contract and the decision to outsource.

Relevance: 30.00%

Publisher:

Abstract:

This article is the second part of a review of the historical evolution of mathematical models applied in the development of building technology. The first part described the current state of the art and contrasted various models with regard to their applications to conventional and intelligent buildings. It concluded that the mathematical techniques adopted in neural networks, expert systems, fuzzy logic and genetic models, which can be used to address model uncertainty, are well suited to modelling intelligent buildings. Despite this progress, the possible future development of intelligent buildings based on current trends implies some potential limitations of these models. This paper attempts to uncover the fundamental limitations inherent in these models and provides some insights into future modelling directions, with special focus on the techniques of semiotics and chaos. Finally, by demonstrating an example of an intelligent building system together with the mathematical models that have been developed for such a system, this review addresses the influence of mathematical models as a potential aid in developing intelligent buildings and perhaps even more advanced buildings in the future.