167 results for Compactification and String Models
Abstract:
Concrete is commonly used as a primary construction material for tall buildings. Load-bearing components such as columns and walls in concrete buildings are subjected to instantaneous and long-term axial shortening caused by the time-dependent effects of "shrinkage", "creep" and "elastic" deformation. Reinforcing steel content, variable concrete modulus, the volume-to-surface-area ratio of the elements and environmental conditions govern axial shortening. The impact of differential axial shortening among columns and core shear walls escalates with increasing building height. Differential axial shortening of gravity-loaded elements in geometrically complex and irregular buildings results in permanent distortion and deflection of the structural frame, which has a significant impact on building envelopes, building services, secondary systems and the lifetime serviceability and performance of a building. Existing numerical methods commonly used in design to quantify axial shortening are mainly based on elastic analytical techniques and are therefore unable to capture the complexity of non-linear time-dependent effects. Ambient measurement of axial shortening using vibrating-wire, external mechanical strain, or electronic strain gauges is available to verify values pre-estimated at the design stage. However, permanently embedding these gauges in, or installing them on the surface of, concrete components for continuous measurement during and after construction with adequate protection is uneconomical, inconvenient and unreliable. Therefore, such methods are rarely, if ever, used in actual building construction practice. This research project has developed a rigorous numerical procedure that encompasses linear and non-linear time-dependent phenomena for predicting the axial shortening of reinforced concrete structural components at the design stage. This procedure takes into consideration (i) the construction sequence, (ii) time-varying values of the Young's modulus of reinforced concrete and (iii) creep and shrinkage models that account for variability resulting from environmental effects. The capabilities of the procedure are illustrated through examples. In order to update predictions of axial shortening during the construction and service stages of the building, this research has also developed a vibration-based procedure using ambient measurements. This procedure takes into consideration the changes in the vibration characteristics of the structure during and after construction. Its application is illustrated through numerical examples that also highlight its features. The vibration-based procedure can also be used as a tool to assess the structural health and performance of key structural components in the building during construction and service life.
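The elastic-plus-creep-plus-shrinkage decomposition described above can be sketched numerically. The sketch below is a minimal illustration, not the rigorous procedure the abstract develops: the hyperbolic time functions, ultimate coefficients and numeric constants are generic textbook-style assumptions chosen only to show how the three strain components combine.

```python
# Minimal sketch: total time-dependent axial strain of a concrete column as
# elastic + creep + shrinkage components. All coefficient models and
# constants below are illustrative assumptions, not the abstract's method.

def elastic_modulus(t, E28=30e9):
    """Assumed time-varying Young's modulus (Pa), maturing toward E28."""
    return E28 * (t / (4.0 + 0.85 * t)) ** 0.5

def creep_coefficient(t, phi_ult=2.35):
    """Assumed creep coefficient phi(t), hyperbolic growth toward phi_ult."""
    return phi_ult * t ** 0.6 / (10.0 + t ** 0.6)

def shrinkage_strain(t, eps_ult=780e-6):
    """Assumed free shrinkage strain, hyperbolic growth toward eps_ult."""
    return eps_ult * t / (35.0 + t)

def axial_shortening(stress, length, t):
    """Shortening (m) of a column of given length (m) under constant
    axial stress (Pa), t days after loading."""
    eps_elastic = stress / elastic_modulus(t)
    eps_total = eps_elastic * (1.0 + creep_coefficient(t)) + shrinkage_strain(t)
    return eps_total * length

# Example: one 3 m storey under 10 MPa sustained stress, after 365 days.
print(round(axial_shortening(10e6, 3.0, 365) * 1000, 2), "mm")
```

Under these assumed coefficients, the creep and shrinkage terms together exceed the elastic term after a year, which is why purely elastic analyses underestimate long-term shortening.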
Abstract:
Over the past decade our understanding of foot function has increased significantly [1,2]. Our understanding of foot and ankle biomechanics appears to be directly correlated with advances in the models used to assess and quantify kinematic parameters in gait. These advances in models, in turn, lead to greater detail in the data. However, we must consider that the level of complexity required is determined by the question or task being analysed. This systematic review aims to provide a critical appraisal of marker sets and foot models commonly used to assess foot and ankle kinematics for a wide variety of clinical and research purposes.
Abstract:
Digital human modelling (DHM) has today matured from research into industrial application. In the automotive domain, DHM has become a commonly used tool in virtual prototyping and human-centred product design. While this generation of DHM supports the ergonomic evaluation of new vehicle designs during early design stages of the product, by modelling anthropometry, posture and motion or predicting discomfort, the future of DHM will be dominated by CAE methods, realistic 3D design, and musculoskeletal and soft tissue modelling down to the micro-scale of molecular activity within single muscle fibres. As a driving force for DHM development, the automotive industry has traditionally used human models in the manufacturing sector (production ergonomics, e.g. assembly) and the engineering sector (product ergonomics, e.g. safety, packaging). In product ergonomics applications, DHM share many common characteristics, creating a unique subset of DHM. These models are optimised for a seated posture, interface to a vehicle seat through standardised methods and provide linkages to vehicle controls. As a tool, they need to interface with other analytic instruments and integrate into complex CAD/CAE environments. Important aspects of current DHM research are functional analysis, model integration and task simulation. Digital (virtual, analytic) prototypes or digital mock-ups (DMU) provide expanded support for testing and verification and consider task-dependent performance and motion. Beyond rigid body mechanics, soft tissue modelling is evolving to become standard in future DHM. When addressing advanced issues beyond the physical domain of anthropometry and biomechanics, modelling of human behaviours and skills is also integrated into DHM. The latest developments include a more comprehensive approach through implementing perceptual, cognitive and performance models, representing human behaviour on a non-physiological level. Through integration of algorithms from the artificial intelligence domain, a vision of the virtual human is emerging.
Abstract:
As organizations reach higher levels of business process management maturity, they often find themselves maintaining very large process model repositories, representing valuable knowledge about their operations. A common practice within these repositories is to create new process models, or extend existing ones, by copying and merging fragments from other models. We contend that if these duplicate fragments, a.k.a. exact clones, can be identified and factored out as shared subprocesses, the repository’s maintainability can be greatly improved. With this purpose in mind, we propose an indexing structure to support fast detection of clones in process model repositories. Moreover, we show how this index can be used to efficiently query a process model repository for fragments. This index, called RPSDAG, is based on a novel combination of a method for process model decomposition (namely the Refined Process Structure Tree) with established graph canonization and string matching techniques. We evaluated the RPSDAG with large process model repositories from industrial practice. The experiments show that a significant number of non-trivial clones can be efficiently found in such repositories, and that fragment queries can be handled efficiently.
Abstract:
The Time magazine ‘Person of the Year’ award is a venerable institution. Established by Time’s founder Henry Luce in 1927 as ‘Man of the Year’, it is an annual award given to ‘a person, couple, group, idea, place, or machine that ‘for better or for worse ... has done the most to influence the events of the year’ (Time 2002, p. 1). In 2010, the award was given to Mark Zuckerberg, the founder and CEO of the social networking site Facebook. There was, however, a strong campaign for the ‘People’s Choice’ award to be given to Julian Assange, the founder and editor-in-chief of Wikileaks, the online whistleblowing site. Earlier in the year Wikileaks had released more than 250 000 US government diplomatic cables through the internet, and the subsequent controversies around the actions of Wikileaks and Assange came to be known worldwide as ‘Cablegate’. The focus of this chapter is not on the implications of ‘Cablegate’ for international diplomacy, which continue to have great significance, but rather upon what the emergence of Wikileaks has meant for journalism, and whether it provides insights into the future of journalism. Both Facebook and Wikileaks, as well as social media platforms such as Twitter and YouTube, and independent media practices such as blogging, citizen journalism and crowdsourcing, are manifestations of the rise of social media, or what has also been termed web 2.0. The term ‘web 2.0’ was coined by Tim O’Reilly, and captures the rise of online social media platforms and services that better realise the collaborative potential of digitally networked media. They do this by moving from the relatively static, top-down notions of interactivity that informed early internet development, towards more open and evolutionary models that better harness collective intelligence by enabling users to become the creators and collaborators in the development of online media content (Musser and O’Reilly 2007; Bruns 2008).
Abstract:
The establishment by the Prime Minister of the Community Business Partnerships Board along with recent taxation reform has drawn attention to corporate philanthropy in Australia. Definitions and models are needed as each of the potential partners – government, corporations and nonprofit organisations – attempts to come to grips with opportunities. The intending partners will need to determine their responsibilities and desired outcomes so that they may work effectively towards mutually beneficial working relationships. Performance indicators need to be determined, benchmarks developed and best practice promoted. A dearth of research exists in this area (Burch, 1998; Industry Commission Report, 1995; Lyons & Hocking, 1998). More exhaustive research, collection and analysis of appropriate data will aid the process. This particular research indicates a lack of understanding between corporations and nonprofit organisations. There are risks inherent in the proposed partnerships, such as inability to reach agreement, potential for increased costs, and failure to deliver by one of the partners. This paper assesses opportunities and risks, suggests topics for high level debate, and indicates models for the development of partnerships.
Abstract:
In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
Abstract:
Ratites are large, flightless birds and include the ostrich, rheas, kiwi, emu, and cassowaries, along with extinct members, such as moa and elephant birds. Previous phylogenetic analyses of complete mitochondrial genome sequences have reinforced the traditional belief that ratites are monophyletic and tinamous are their sister group. However, in these studies ratite monophyly was enforced in the analyses that modeled rate heterogeneity among variable sites. Relaxing this topological constraint results in strong support for the tinamous (which fly) nesting within ratites. Furthermore, upon reducing base compositional bias and partitioning models of sequence evolution among protein codon positions and RNA structures, the tinamou–moa clade grouped with kiwi, emu, and cassowaries to the exclusion of the successively more divergent rheas and ostrich. These relationships are consistent with recent results from a large nuclear data set, whereas our strongly supported finding of a tinamou–moa grouping further resolves palaeognath phylogeny. We infer flight to have been lost among ratites multiple times in temporally close association with the Cretaceous–Tertiary extinction event. This circumvents requirements for transient microcontinents and island chains to explain discordance between ratite phylogeny and patterns of continental breakup. Ostriches may have dispersed to Africa from Eurasia, putting in question the status of ratites as an iconic Gondwanan relict taxon. [Base composition; flightless; Gondwana; mitochondrial genome; Palaeognathae; phylogeny; ratites.]
Abstract:
The combined impact of social class, cultural background and experience upon early literacy achievement in the first year of schooling is among the most durable questions in educational research. Links have been established between social class and achievement but literacy involves complex social and cognitive practices that are not necessarily reflected in the connections that have been made. The complexity of relationships between social class, cultural background and experience, and their impact on early literacy achievement have received little research attention. Recent refinements of the broad terms of social class or socioeconomic status have questioned the established links between social class and achievement. Nevertheless, it remains difficult to move beyond deficit and mismatch models of explaining and understanding the underperformance of children from lower socioeconomic and cultural minority groups when conventional measures are used. The data from an Australian pilot study reported here add to the increasing evidence that income is not necessarily related directly to home literacy resources or to how those resources are used. Further, the data show that the level of print resources in the home may not be a good indicator of the level of use of those resources.
Abstract:
Programmed cell death (PCD) and progenitor cell generation (of glial and, in some brain areas, also neuronal fate) in the CNS are active processes throughout life and are generally not associated with gliosis, which means that PCD can be pathologically silent. The striking discovery that progenitor cell generation is widespread in the adult CNS (especially the hippocampus) suggests a much more dynamic scenario than previously thought and transcends the dichotomy between neurodevelopmental and neurodegenerative models of schizophrenia and related disorders. We suggest that the processes controlling the regulation of PCD and the generation of progenitor cells may be disturbed in the early phase of psychotic disorders, underpinning a disconnectivity syndrome at the onset of clinically overt disorders. An ongoing 1H-MRS study of the anterior hippocampus at 3 Tesla in mostly drug-naive first-episode psychosis patients suggests no change in NAA, but significant increases in myo-inositol and lactate. The data suggest that neuronal integrity in the anterior hippocampus is still intact at this early stage of illness, or is mainly only functionally impaired. However, the increases in lactate and myo-inositol may reflect a disturbance of the generation and PCD of progenitor cells at the onset of psychosis. If true, the use of neuroprotective agents such as lithium or eicosapentaenoic acid (which inhibit PCD and support cell generation) in the early phase of psychotic disorders may be a potent treatment avenue to explore.
Abstract:
An important issue facing Canadians today is crime control and prevention. Research done in the late 1980s and early 1990s by three sociologists shows that Canadian federal criminal justice policies and practices adopted by the Mulroney government from 1984 to 1990 were inconsistent with US ‘law and order’ models in place at that time. However, since the mid‐1990s, Canadian federal and provincial governments have mimicked some US authoritarian and gender‐blind means of curbing crime. The main objective of this paper is to provide some key examples of criminal justice policy transfer from the USA in Canada. At first glance, Canada may appear to be a ‘kinder, gentler nation,’ but not to the extent assumed by many, if not most, outside observers.
Abstract:
Kaolinite:NaCl intercalates with basal layer dimensions of 0.95 and 1.25 nm have been prepared by direct reaction of saturated aqueous NaCl solution with well-crystallized source clay KGa-1. The intercalates and their thermal decomposition products have been studied by XRD, solid-state 23Na, 27Al, and 29Si MAS NMR, and FTIR. Intercalate yield is enhanced by dry grinding of kaolinite with NaCl prior to intercalation. The layered structure survives dehydroxylation of the kaolinite at 500°–600°C and persists to above 800°C with a resultant tetrahedral aluminosilicate framework. Excess NaCl can be readily removed by rinsing with water, producing an XRD ‘amorphous’ material. Upon heating at 900°C this material converts to a well-crystallized framework aluminosilicate closely related to low-carnegieite, NaAlSiO4, some 350°C below its stability field. Reaction mechanisms are discussed and structural models proposed for each of these novel materials.
Abstract:
Asset management (AM) processes play an important role in assisting enterprises to manage their assets more efficiently. To visualise and improve AM processes, the processes need to be modelled using certain process modelling methodologies. Understanding the requirements for AM process modelling is essential for selecting or developing effective AM process modelling methodologies. However, little research has been done on analysing the requirements. This paper attempts to fill this gap by investigating the features of AM processes. It is concluded that AM process modelling requires intuitive representation of its processes, ‘fast’ implementation of the process modelling, effective evaluation of the processes and sound system integration.
Abstract:
The ability to forecast machinery health is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models which attempt to forecast machinery health based on condition data such as vibration measurements. This paper demonstrates how the population characteristics and condition monitoring data (both complete and suspended) of historical items can be integrated for training an intelligent agent to predict asset health multiple steps ahead. The model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan–Meier estimator and a degradation-based failure probability density function estimator. The trained network is capable of estimating the future survival probabilities when a series of asset condition readings are inputted. The output survival probabilities collectively form an estimated survival curve. Pump data from a pulp and paper mill were used for model validation and comparison. The results indicate that the proposed model can predict more accurately as well as further ahead than similar models which neglect population characteristics and suspended data. This work presents a compelling concept for longer-range fault prognosis utilising available information more fully and accurately.
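The Kaplan–Meier step underlying the training targets can be shown concretely: survival at time t is the product over earlier event times t_i of (1 - d_i / n_i), where d_i items fail among the n_i still at risk, and suspended (censored) histories leave the risk set without counting as failures. The sketch below implements the standard estimator on invented pump histories; it is an illustration of the principle, not the paper's variation of it.

```python
# Sketch of the standard Kaplan-Meier survival estimator, the basis of the
# training targets described above. Suspended (censored) items shrink the
# risk set but contribute no failure. The data below are invented.

def kaplan_meier(observations):
    """observations: list of (time, event), event=1 failure, 0 suspension.
    Returns the survival step function as [(event_time, survival_prob)]."""
    obs = sorted(observations)
    n_at_risk = len(obs)
    survival, curve = 1.0, []
    i = 0
    while i < len(obs):
        t = obs[i][0]
        deaths = censored = 0
        # gather all observations tied at time t
        while i < len(obs) and obs[i][0] == t:
            if obs[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= deaths + censored
    return curve

# Pump histories: failures at t=3 and twice at t=5; suspensions at 4 and 6.
print(kaplan_meier([(3, 1), (4, 0), (5, 1), (5, 1), (6, 0)]))
```

The suspended records at t=4 and t=6 do not drop the curve, but they do reduce the denominator for later failures, which is exactly the information that models ignoring suspended data throw away.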
Abstract:
Transition between epithelial and mesenchymal states is a feature of both normal development and tumor progression. We report that expression of chloride channel accessory protein hCLCA2 is a characteristic of epithelial differentiation in the immortalized MCF10A and HMLE models, while induction of epithelial-to-mesenchymal transition by cell dilution, TGFβ or mesenchymal transcription factors sharply reduces hCLCA2 levels. Attenuation of hCLCA2 expression by lentiviral small hairpin RNA caused cell overgrowth and focus formation, enhanced migration and invasion, and increased mammosphere formation in methylcellulose. These changes were accompanied by downregulation of E-cadherin and upregulation of mesenchymal markers such as vimentin and fibronectin. Moreover, hCLCA2 expression is greatly downregulated in breast cancer cells with a mesenchymal or claudin-low profile. These observations suggest that loss of hCLCA2 may promote metastasis. We find that higher-than-median expression of hCLCA2 is associated with a one-third lower rate of metastasis over an 18-year period among breast cancer patients compared with lower-than-median (n=344, unfiltered for subtype). Thus, hCLCA2 is required for epithelial differentiation, and its loss during tumor progression contributes to metastasis. Overexpression of hCLCA2 has been reported to inhibit cell proliferation and is accompanied by increases in chloride current at the plasma membrane and reduced intracellular pH (pHi). We found that knockdown cells have sharply reduced chloride current and higher pHi, both characteristics of tumor cells. These results suggest a mechanism for the effects on differentiation. Loss of hCLCA2 may allow escape from pHi homeostatic mechanisms, permitting the higher intracellular and lower extracellular pH that are characteristic of aggressive tumor cells.