143 results for Hierarchical morphology
Abstract:
The paper investigates a detailed Active Shock Control Bump (SCB) design optimisation on a Natural Laminar Flow (NLF) aerofoil, RAE 5243, to reduce cruise drag at transonic flow conditions using Evolutionary Algorithms (EAs) coupled to a robust design approach. As the uncertain design parameters, the position of boundary layer transition (xtr) and the coefficient of lift (Cl) are considered (250 stochastic samples in total). Two robust design methods are considered; the first is a standard robust design method, which evaluates one design model at 250 stochastic conditions for uncertainty. The second combines the standard robust design method with the concept of hierarchical (multi-population) sampling (250, 50, 15) for uncertainty. Numerical results show that the evolutionary optimisation method coupled to uncertainty design techniques produces useful and reliable Pareto optimal SCB shapes which have low sensitivity and high aerodynamic performance while achieving significant total drag reduction. In addition, the results show the benefit of using the hierarchical robust method for detailed uncertainty design optimisation.
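The hierarchical (multi-population) sampling idea can be illustrated with a toy sketch. Note that the drag model, parameter ranges, and design variable below are invented stand-ins for the paper's CFD evaluation; only the sample counts (250, 50, 15) follow the abstract:

```python
import numpy as np

def robust_fitness(design, n_samples, rng):
    """Mean and spread of a (hypothetical) drag model over stochastic
    samples of the uncertain parameters xtr and Cl."""
    xtr = rng.uniform(0.4, 0.6, n_samples)   # transition-location samples
    cl = rng.normal(0.68, 0.02, n_samples)   # lift-coefficient samples
    # Placeholder quadratic drag model standing in for a CFD run:
    drag = 0.02 + 0.5 * (design - xtr) ** 2 + 0.1 * (cl - 0.68) ** 2
    return drag.mean(), drag.std()

rng = np.random.default_rng(0)
# Hierarchical sampling: the fidelity of the uncertainty evaluation
# decreases down the population hierarchy (250, 50, 15 samples).
for level, n in enumerate((250, 50, 15)):
    mean_cd, std_cd = robust_fitness(0.5, n, rng)
    print(f"level {level}: n={n}, mean drag={mean_cd:.4f}, std={std_cd:.4f}")
```

In a robust EA, the mean and spread would jointly drive the Pareto ranking, with cheap low-sample evaluations used for the lower populations.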
Abstract:
The semiaquatic platypus and terrestrial echidnas (spiny anteaters) are the only living egg-laying mammals (monotremes). The fossil record has provided few clues as to their origins and the evolution of their ecological specializations; however, recent reassignment of the Early Cretaceous Teinolophos and Steropodon to the platypus lineage implies that platypuses and echidnas diverged >112.5 million years ago, reinforcing the notion of monotremes as living fossils. This placement is based primarily on characters related to a single feature, the enlarged mandibular canal, which supplies blood vessels and dense electrosensory receptors to the platypus bill. Our reevaluation of the morphological data instead groups platypus and echidnas to the exclusion of Teinolophos and Steropodon and suggests that an enlarged mandibular canal is ancestral for monotremes (partly reversed in echidnas, in association with general mandibular reduction). A multigene evaluation of the echidna–platypus divergence using both a relaxed molecular clock and direct fossil calibrations reveals a recent split of 19–48 million years ago. Platypus-like monotremes (Monotrematum) predate this divergence, indicating that echidnas had aquatically foraging ancestors that reinvaded terrestrial ecosystems. This ecological shift and the associated radiation of echidnas represent a recent expansion of niche space despite potential competition from marsupials. Monotremes might have survived the invasion of marsupials into Australasia by exploiting ecological niches in which marsupials are restricted by their reproductive mode. Morphology, ecology, and molecular biology together indicate that Teinolophos and Steropodon are basal monotremes rather than platypus relatives, and that living monotremes are a relatively recent radiation.
Abstract:
Navigational collisions are one of the major safety concerns in many seaports. Despite the extent of recent work on port navigational safety research, little is known about harbor pilots' perception of collision risks in port fairways. This paper uses a hierarchical ordered probit model to investigate associations between perceived risks and the geometric and traffic characteristics of fairways and the pilot attributes. Perceived risk data, collected through a risk perception survey conducted among the Singapore port pilots, are used to calibrate the model. The intra-class correlation coefficient justifies use of the hierarchical model over an ordinary model. Results show higher perceived risks in fairways attached to anchorages, and in those featuring sharper bends and higher traffic operating speeds. Lower risks are perceived in fairways attached to shorelines and confined waters, and in those with one-way traffic, a traffic separation scheme, cardinal marks and isolated danger marks. Risk is also perceived to be higher at night.
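The role of the intra-class correlation coefficient can be sketched with a crude variance-components version (a simplification of the usual ANOVA estimator; the ratings below are invented, not survey data):

```python
import numpy as np

def icc(groups):
    """Crude one-way ICC: share of total variance lying between groups
    (here, between fairways each rated by several pilots)."""
    between = np.var([np.mean(g) for g in groups], ddof=1)
    within = np.mean([np.var(g, ddof=1) for g in groups])
    return between / (between + within)

# Invented 1-5 risk ratings for three fairways, three pilots each:
clustered = [[4.0, 4.2, 3.9], [2.0, 2.1, 1.8], [5.0, 4.8, 4.9]]
print(f"ICC = {icc(clustered):.2f}")  # high ICC -> hierarchical model warranted
```

A high ICC means ratings of the same fairway agree much more than ratings of different fairways, which is exactly the within-group correlation an ordinary (pooled) ordered probit would ignore.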
Abstract:
This thesis develops a detailed conceptual design method and a system software architecture defined with a parametric and generative evolutionary design system to support an integrated interdisciplinary building design approach. The research recognises the need to shift design efforts toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication on the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements, across a wide range of environmental and social circumstances. A rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and the ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research work proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This will produce design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design will produce solutions through a design process that considers and balances the requirements of all aspects of the design.
Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that had not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the base for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate the system performance based on the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components. The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of the human creativity of designers in a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms.
The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the design requirements of each level can be dealt with separately, and then reassembling them in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach of exploring the range of design solutions through modification of the design schema, as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity.
By focusing on finding solutions to the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists the design decision-making process.
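The bottom-up idea of solving each level separately and letting viable lower-level solutions constrain the next level can be sketched in toy form. The 'Room'/'Layout' fitness rules and the numbers here are invented for illustration, not taken from the thesis:

```python
import itertools

# Toy two-level hierarchy: viable candidates at the 'Room' level constrain
# what the 'Layout' level may assemble.
def room_fitness(area):
    """Room level: accept only areas between 10 and 30 m^2 (invented rule)."""
    return 10 <= area <= 30

rooms = [5, 12, 18, 25, 40]
viable_rooms = [a for a in rooms if room_fitness(a)]   # non-viable rooms pruned

def layout_fitness(layout, target=50):
    """Layout level: total area as close to the target as possible."""
    return abs(sum(layout) - target)

# Only viable rooms are ever combined, shrinking the search space upstream:
layouts = list(itertools.combinations(viable_rooms, 2))
best = min(layouts, key=layout_fitness)
print(best)   # (18, 25): the viable pair whose total area is closest to 50
```

In the HEAD system proper, genetic algorithms with level-specific fitness functions would replace this exhaustive enumeration, but the pruning effect of the hierarchy is the same.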
Abstract:
Most crash severity studies have ignored severity correlations between driver-vehicle units involved in the same crash. Models that do not account for these within-crash correlations will yield biased estimates of the factor effects. This study developed a Bayesian hierarchical binomial logistic model to identify the significant factors affecting the severity level of driver injury and vehicle damage in traffic crashes at signalized intersections. Crash data from Singapore were employed to calibrate the model. Model fit assessment and comparison using the Intra-class Correlation Coefficient (ICC) and the Deviance Information Criterion (DIC) confirmed the suitability of introducing the crash-level random effects. Crashes occurring at peak time, in good street lighting conditions, or involving pedestrian injuries are associated with a lower severity, while those at night time, at T/Y-type intersections, on the right-most lane, or at intersections with a red light camera installed have larger odds of being severe. Moreover, heavy vehicles offer better protection against severe crashes, while crashes involving two-wheeled vehicles, young or aged drivers, or an offending party are more likely to result in severe injuries.
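The within-crash correlation that motivates the crash-level random effects can be demonstrated with simulated data. This is a minimal sketch of the data-generating view of such a model; the coefficients, random-effect scale, and sample sizes are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n_crashes, sigma_u = 1000, 1.5
u = rng.normal(0.0, sigma_u, n_crashes)      # crash-level random effect
x = rng.normal(size=(n_crashes, 2))          # one covariate per driver-vehicle unit
logit = -0.5 + 0.8 * x + u[:, None]          # both units in a crash share u
p = 1.0 / (1.0 + np.exp(-logit))
y = rng.binomial(1, p)                       # severe (1) vs non-severe (0)

# Outcomes of the two units in the same crash are correlated because they
# share u; ignoring this shared term is what biases ordinary logistic fits.
r = np.corrcoef(y[:, 0], y[:, 1])[0, 1]
print(f"within-crash outcome correlation: {r:.2f}")
```

A Bayesian hierarchical fit would place a prior on sigma_u and estimate it jointly with the fixed effects via MCMC.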
Abstract:
Motorcycles are overrepresented in road traffic crashes and particularly vulnerable at signalized intersections. The objective of this study is to identify causal factors affecting motorcycle crashes at both four-legged and T signalized intersections. Treating the data in time-series cross-section panels, this study explores different hierarchical Poisson models and finds that the model allowing an autoregressive lag-1 specification in the error term is the most suitable. Results show that the number of lanes at four-legged signalized intersections significantly increases motorcycle crashes, largely because of the higher exposure resulting from higher motorcycle accumulation at the stop line. Furthermore, the presence of a wide median and an uncontrolled left-turn lane at the major roadways of four-legged intersections exacerbates this potential hazard. For T signalized intersections, the presence of an exclusive right-turn lane at both major and minor roadways and an uncontrolled left-turn lane at major roadways increases motorcycle crashes. Motorcycle crashes increase on high-speed roadways because motorcyclists are more vulnerable and less likely to react in time during conflicts. The presence of red light cameras reduces motorcycle crashes significantly at both four-legged and T intersections. With the red light camera, motorcycles are less exposed to conflicts because they are observed to be more disciplined in queuing at the stop line and less likely to jump the start of green.
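The autoregressive lag-1 error specification can be sketched as a data-generating process (a minimal illustration with invented parameters, not the paper's fitted model):

```python
import numpy as np

rng = np.random.default_rng(1)
T, rho, sigma = 500, 0.6, 0.3        # periods, AR(1) coefficient, innovation s.d.
eps = np.zeros(T)
for t in range(1, T):
    # AR(1) error: this period's unobserved site effect carries over from the last
    eps[t] = rho * eps[t - 1] + rng.normal(0.0, sigma)
lam = np.exp(0.5 + eps)              # Poisson mean with serially correlated error
crashes = rng.poisson(lam)           # simulated crash counts at one site over time

# The serial dependence shows up as lag-1 autocorrelation in the error term:
r1 = np.corrcoef(eps[:-1], eps[1:])[0, 1]
print(f"lag-1 autocorrelation of the error: {r1:.2f}")
```

A model that assumes independent errors across panels would understate uncertainty for data generated this way, which is why the AR-1 specification fits panel crash counts better.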
Abstract:
Traditional crash prediction models, such as generalized linear regression models, are incapable of taking into account the multilevel data structure that extensively exists in crash data. Disregarding the possible within-group correlations can lead to models giving unreliable and biased estimates of unknowns. This study innovatively proposes a five-level hierarchy, viz. (Geographic region level – Traffic site level – Traffic crash level – Driver-vehicle unit level – Vehicle-occupant level) × Time level, to establish a general form of multilevel data structure in traffic safety analysis. To properly model the potential cross-group heterogeneity due to the multilevel data structure, a framework of Bayesian hierarchical models that explicitly specify the multilevel structure and correctly yield parameter estimates is introduced and recommended. The proposed method is illustrated in an individual-severity analysis of intersection crashes using the Singapore crash records. This study demonstrates the importance of accounting for the within-group correlations as well as the flexibility and effectiveness of the Bayesian hierarchical method in modeling the multilevel structure of traffic crash data.
Abstract:
This study proposes a full Bayes (FB) hierarchical modeling approach to traffic crash hotspot identification. The FB approach is able to account for all uncertainties associated with crash risk and various risk factors by estimating a posterior distribution of site safety, on which various ranking criteria can be based. Moreover, through hierarchical model specification, the FB approach can flexibly take into account various heterogeneities of crash occurrence due to spatiotemporal effects on traffic safety. Using Singapore intersection crash data (1997-2006), an empirical evaluation was conducted to compare the proposed FB approach to the state-of-the-art approaches. Results show that the Bayesian hierarchical models accommodating site-specific effects and serial correlation have better goodness-of-fit than non-hierarchical models. Furthermore, all model-based approaches perform significantly better in safety ranking than the naive approach using raw crash counts. The FB hierarchical models were found to significantly outperform the standard EB approach in correctly identifying hotspots.
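One kind of posterior-based ranking criterion that the FB approach enables can be sketched as follows. The three sites and their gamma-distributed "posterior draws" are invented stand-ins for MCMC output from a fitted hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(7)
# Posterior samples of each site's expected crash frequency:
posterior = {
    "site_A": rng.gamma(2.0, 1.0, 5000),   # low-risk site
    "site_B": rng.gamma(8.0, 1.0, 5000),   # high-risk site
    "site_C": rng.gamma(4.0, 1.0, 5000),   # intermediate site
}
threshold = 5.0
# Rank by the posterior probability that a site's risk exceeds a threshold,
# a criterion that a raw crash count cannot provide:
prob_exceed = {s: float(np.mean(d > threshold)) for s, d in posterior.items()}
ranking = sorted(prob_exceed, key=prob_exceed.get, reverse=True)
print(ranking)
```

Because the whole posterior is available, other criteria (posterior mean, rank probabilities, exceedance of a cost-weighted threshold) drop out of the same set of draws.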
Abstract:
Navigational collisions are a major safety concern in many seaports. Despite the recent advances in port navigational safety research, little is known about harbor pilots' perception of collision risks in anchorages. This study attempts to model such risks by employing a hierarchical ordered probit model, calibrated using data collected through a risk perception survey conducted on Singapore port pilots. The hierarchical model is found to be useful for accounting for correlations in risks perceived by individual pilots. Results show higher perceived risks in anchorages attached to intersections and to local and international fairways, becoming more critical at night. Lower risks are perceived in anchorages featuring a shoreline boundary, higher water depth, a lower density of stationary ships, cardinal marks and isolated danger marks. Pilotage experience shows a negative effect on perceived risks. This study indicates that hierarchical modeling would be useful for treating correlations in navigational safety data.
Abstract:
Baseline monitoring of groundwater quality aims to characterize the ambient condition of the resource and identify spatial or temporal trends. Sites comprising any baseline monitoring network must be selected to provide a representative perspective of groundwater quality across the aquifer(s) of interest. Hierarchical cluster analysis (HCA) has been used as a means of assessing the representativeness of a groundwater quality monitoring network, using example datasets from New Zealand. HCA allows New Zealand's national and regional monitoring networks to be compared in terms of the number of water-quality categories identified in each network, the hydrochemistry at the centroids of these water-quality categories, the proportions of monitoring sites assigned to each water-quality category, and the range of concentrations for each analyte within each water-quality category. Through the HCA approach, the National Groundwater Monitoring Programme (117 sites) is shown to provide a highly representative perspective of groundwater quality across New Zealand, relative to the amalgamated regional monitoring networks operated by 15 different regional authorities (680 sites have sufficient data for inclusion in HCA). This methodology can be applied to evaluate the representativeness of any subset of monitoring sites taken from a larger network.
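A minimal HCA sketch in the spirit of the network comparison (the analytes, concentrations, and cut level below are invented; the study itself uses real New Zealand monitoring data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# Hypothetical hydrochemistry (three analytes per site) for a network with
# two distinct water-quality categories plus sampling noise:
fresh = rng.normal([10, 1, 100], [2, 0.3, 10], size=(20, 3))
saline = rng.normal([200, 5, 300], [20, 1, 30], size=(10, 3))
sites = np.vstack([fresh, saline])

# Ward linkage on standardised analytes, cut into two categories:
z = (sites - sites.mean(0)) / sites.std(0)
labels = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")

# A subset network is representative if it covers every category found in
# the full network (and in similar proportions, checked the same way):
subset = labels[::3]
print(sorted(set(labels.tolist())), sorted(set(subset.tolist())))
```

The network comparison in the abstract is this check at national scale: do the 117 NGMP sites reproduce the categories, centroids, and proportions found across the 680 regional sites?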
Abstract:
The study of urban morphology has become an expanding field of research within the architectural discipline, providing theories to be used as tools in the understanding and design of urban landscapes from the past, the present and into the future. Drawing upon contemporary architectural design theory, this investigation reveals what a sectional analysis of an urban landscape can add to the existing research methods within this field. This paper conducts an enquiry into the use of the section as a tool for urban morphological analysis. Following the methodology of the British school of urban morphology, sections through the urban fabric of the case study city of Brisbane are compared. The results are categorised to depict changes in scale, components and utilisation throughout various timeframes. The key findings illustrate how the section, when read in conjunction with the plan, can be used to interpret changes to urban form and the relationship that this has to the quality of the urban environment in the contemporary city.
Abstract:
Secrecy of decryption keys is an important prerequisite for the security of any encryption scheme, and compromised private keys must be immediately replaced. Forward Security (FS), introduced to Public Key Encryption (PKE) by Canetti, Halevi, and Katz (Eurocrypt 2003), reduces the damage from compromised keys by guaranteeing confidentiality of messages that were encrypted prior to the compromise event. The FS property was also shown to be achievable in (Hierarchical) Identity-Based Encryption (HIBE) by Yao, Fazio, Dodis, and Lysyanskaya (ACM CCS 2004). Yet, for emerging encryption techniques offering flexible access control to encrypted data by means of functional relationships between ciphertexts and decryption keys, FS protection was not known to exist. In this paper we introduce FS to the powerful setting of Hierarchical Predicate Encryption (HPE), proposed by Okamoto and Takashima (Asiacrypt 2009). Anticipated applications of FS-HPE schemes can be found in searchable encryption and in fully private communication. Considering the dependencies among the concepts, our FS-HPE scheme implies forward-secure flavors of Predicate Encryption and (Hierarchical) Attribute-Based Encryption. Our FS-HPE scheme guarantees forward security for plaintexts and for attributes that are hidden in HPE ciphertexts. It further allows delegation of decryption abilities at any point in time, independent of FS time evolution. It realizes zero-inner-product predicates and is proven adaptively secure under standard assumptions. As the "cross-product" approach taken in FS-HIBE is not directly applicable to the HPE setting, our construction resorts to techniques that are specific to existing HPE schemes and extends them with what can be seen as reminiscent of binary tree encryption from FS-PKE.
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now being applied widely to improve the quality of products and services in industrial and business sectors. Recently SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause identification efforts in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach enables us to obtain highly informative estimates of the change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared to a priori known causes, detected by control charts in monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then continued in healthcare surveillance for processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, highly recommend the developed Bayesian estimators as a strong alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered. The advantages of the Bayesian approach, seen here in the general context of quality control, may also extend to the industrial and business domains where quality monitoring was initially developed.
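The step-change estimation idea can be sketched with a grid posterior over the change time. This is a deliberately simplified stand-in: the segment rates are plugged in as segment means rather than sampled by MCMC as in the thesis, and the monthly counts are simulated:

```python
import numpy as np

rng = np.random.default_rng(5)
# Simulated monthly adverse-event counts with a step change at month 30:
counts = np.concatenate([rng.poisson(2.0, 30), rng.poisson(5.0, 30)])
T = len(counts)

def log_pois(y, lam):
    # Poisson log-likelihood, dropping terms that do not depend on tau
    return float(np.sum(y * np.log(lam) - lam))

# Uniform prior over tau => posterior proportional to the profile likelihood
logpost = np.full(T, -np.inf)
for tau in range(3, T - 3):
    lam1 = counts[:tau].mean() + 1e-9    # pre-change rate estimate
    lam2 = counts[tau:].mean() + 1e-9    # post-change rate estimate
    logpost[tau] = log_pois(counts[:tau], lam1) + log_pois(counts[tau:], lam2)
tau_hat = int(np.argmax(logpost))
print(f"estimated change month: {tau_hat}")
```

Normalising `logpost` would give a full posterior over the change time, which is what lets the thesis report probability intervals for the change point rather than a single estimate.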