915 results for Reliable multicast
Abstract:
This paper presents a higher-order beam-column formulation that can capture the geometrically non-linear behaviour of steel framed structures containing many slender members. Despite advances in computational frame software, analyses of large frames can still be numerically problematic, and so the paper aims to fulfil a need for versatile, reliable and efficient non-linear analysis of general steel framed structures with a large number of members. Following a comprehensive review of numerical frame analysis techniques, a fourth-order element is derived and implemented in an updated Lagrangian formulation; it is able to predict the flexural buckling, snap-through buckling and large-displacement post-buckling behaviour of typical structures whose responses have been reported by independent researchers. The solutions are shown to strike an effective balance between accuracy and computational expediency. The higher-order element forms a basis for augmenting the geometrically non-linear approach with material non-linearity through the refined plastic hinge methodology described in the companion paper.
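As a rough illustration of what a fourth-order element involves (the paper's actual shape functions and degrees of freedom are not reproduced here), the transverse displacement field gains a quartic term over the usual cubic element, so the curvature can vary quadratically along the member:

    % Illustrative quartic displacement field for a fourth-order beam element;
    % the coefficients a_0 ... a_4 follow from the element's degrees of freedom.
    \begin{align}
      v(x)   &= a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4, \\
      v''(x) &= 2 a_2 + 6 a_3 x + 12 a_4 x^2 .
    \end{align}

The richer curvature representation is one reason a single such element per member can capture buckling behaviour that a cubic element would need a finer mesh to resolve.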
Abstract:
Finite element frame analysis programs targeted at design office application require algorithms that can deliver reliable numerical convergence in a practical timeframe without sacrificing accuracy, and a highly desirable attribute is the use of a single element per member, which reduces computational storage as well as the effort of data preparation and interpretation of results. To this end, this paper presents a higher-order finite element method, including geometric non-linearity, for the analysis of elastic frames in which a single element is used to model each member. The geometric non-linearity in the structure is handled using an updated Lagrangian formulation, which accounts for the effects of the large translations and rotations that occur at the joints by updating the nodal coordinates. Rigid-body movements are eliminated from the local member load-displacement relationship, for which a total secant stiffness is formulated to evaluate the large deformations of an element. The influence of axial force on member stiffness and the change in member chord length are taken into account using a modified bowing function, formulated within the total secant stiffness relationship, which includes the coupling of axial strain and flexural bowing. The accuracy and efficiency of the technique are verified by comparisons with a number of plane and spatial structures whose structural response has been reported in independent studies.
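For context, bowing functions of this kind couple a member's axial strain to its end rotations; one classical form (after Oran-type beam-column formulations; signs and symbols vary by convention, and the paper's modified function is not reproduced here) is:

    % Chord shortening split into axial strain and flexural bowing.
    % e: axial strain; u: relative axial end displacement; L: member length;
    % theta_1, theta_2: end rotations; b_1, b_2: bowing functions of the
    % axial force parameter q = P L^2 / (pi^2 E I).
    \begin{equation}
      \frac{u}{L} = e - \left[\, b_1(q)\,(\theta_1 + \theta_2)^2
                               + b_2(q)\,(\theta_1 - \theta_2)^2 \,\right].
    \end{equation}

Because b_1 and b_2 depend on the axial force, the relation couples axial and flexural behaviour, which is exactly the interaction the abstract refers to.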
Abstract:
Purpose: This paper develops and estimates a model to measure consumer perceptions of trade show effectiveness.
Design/methodology/approach: Data were collected at three separate B2C trade shows. Study 1 (n=47) involved field interviews with data subjected to qualitative item generation and content analysis. Study 2 data (n=147) were subjected to exploratory factor analysis and item-total correlation to identify a preliminary factor structure for the effectiveness construct and to test for reliability. In Study 3 (n=592), confirmatory factor analysis was undertaken to more rigorously test the factor structure and generalise across industries. Validity testing was also performed.
Findings: A three-dimensional factor structure for assessing consumer visitors’ perceptions of trade show effectiveness was produced, incorporating research, operational, and entertainment components.
Research limitations/implications: Data were collected in Australia and results may not generalise across cultural boundaries.
Practical implications: The resulting measurement model may be used as a reliable post-hoc diagnostic tool to identify areas of trade show effectiveness where specific performance improvements are needed. Results indicate that exhibitors and organisers of B2C trade shows should consider effectiveness as a multidimensional phenomenon, with entertainment, product/industry research, and the facilitation of purchase decision-making processes and problem resolution being key objectives for consumer attendees. These elements of effectiveness should each be addressed by exhibitors and organisers in planning their displays and events.
Originality/value: This is the first study to provide an empirically valid model for assessing trade show effectiveness from the consumer visitor’s perspective.
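As a minimal sketch of the item screening step mentioned above, a corrected item-total correlation relates each survey item to the sum of the remaining items; weakly correlated items are candidates for removal. The data and any cut-off are invented for illustration and are not the study's:

    # Corrected item-total correlation for a (respondents x items) matrix.
    # Illustrative only; the study's items and data are not reproduced here.
    import numpy as np

    def corrected_item_total(X: np.ndarray) -> np.ndarray:
        """One correlation per item, against the total of the other items."""
        r = []
        for j in range(X.shape[1]):
            rest = X.sum(axis=1) - X[:, j]  # total score excluding item j
            r.append(np.corrcoef(X[:, j], rest)[0, 1])
        return np.array(r)

    rng = np.random.default_rng(1)
    X = rng.integers(1, 6, (50, 5)).astype(float)  # 50 respondents, 5 Likert items
    print(corrected_item_total(X).round(2))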
Abstract:
Material yielding is typically modelled by either plastic zone or plastic hinge methods in the context of geometric and material non-linear finite element methods. In fire analysis of steel structures, the plastic zone method is widely used, but it requires considerably more computational effort. The objective of this paper is to develop a non-linear material model allowing for the interaction of axial force and bending moment, which relies on the plastic hinge method to achieve numerical efficiency and reduce computational effort. The biggest advantage of the plastic-hinge approach is its computational efficiency and easy verification against the design code formulae for the axial force–moment interaction yield criterion for beam–column members. Further, the method is reliable and robust when used in the analysis of practical and large structures. In order to allow for the effect of catenary action, axial thermal expansion is considered in the axial restraint equations. The yield function incorporated in the stiffness formulation, which allows for both axial force and bending moment effects, is more accurate and rational for predicting the behaviour of frames under fire. In the present fire analysis, the mechanical properties at elevated temperatures mainly follow Eurocode 3 [Design of steel structures, Part 1.2: Structural fire design. European Committee for Standardisation; 2003]. An example of a tension member under a steady-state heating condition is modelled to verify the proposed spring formulation and to compare with results by others. The behaviour of a heated member in a highly redundant structure is also studied by the present approach.
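A typical axial force–bending moment interaction yield criterion of the kind used in plastic-hinge analysis has the following general shape (illustrative only; the paper's yield function and the Eurocode 3 reduction factors are not reproduced here):

    % Full-section yielding when Phi = 0. P_y(T) and M_p(T) are the squash
    % load and plastic moment capacity, reduced at elevated temperature T
    % (e.g. via the elevated-temperature yield strength).
    \begin{equation}
      \Phi(P, M) = \frac{P}{P_y(T)} + \frac{M}{M_p(T)} - 1 \;\le\; 0 .
    \end{equation}

Design codes replace this linear interaction with more refined curves, but the principle is the same: the hinge forms under a combination of axial force and moment rather than under moment alone.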
Abstract:
Aims: To compare different methods for identifying alcohol involvement in injury-related emergency department presentations in Queensland youth, and to explore the alcohol terminology used in triage text. Methods: Emergency Department Information System data were provided for patients aged 12-24 years with an injury-related diagnosis code presenting to a Queensland emergency department over the 5-year period 2006-2010 (N=348,895). Three approaches were used to estimate alcohol involvement: 1) analysis of coded data, 2) mining of triage text, and 3) estimation using an adaptation of alcohol-attributable fractions (AAF). Cases were identified as ‘alcohol-involved’ by code and text, as well as AAF-weighted. Results: Around 6.4% of these injury presentations had some documentation of alcohol involvement, with higher proportions documented for 18-24 year olds, females, Indigenous youth, presentations occurring on a Saturday or Sunday, and presentations occurring between midnight and 5 am. The most common alcohol terms identified for all subgroups were generic ones (e.g. ETOH or alcohol), with almost half of the cases where alcohol involvement was documented having a generic alcohol term recorded in the triage text. Conclusions: Emergency department data are a useful source of information for identifying high-risk subgroups and targeting intervention opportunities, though they are not a reliable source for incidence or trend estimation in their current unstandardised form. Improving the accuracy and consistency of identifying, documenting and coding alcohol involvement at the point of data capture in the emergency department is the most desirable long-term approach to producing a more solid evidence base to support policy and practice in this field.
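A minimal sketch of the triage-text mining approach, assuming a simple keyword lexicon (the term list and record format below are illustrative, not the study's actual coding scheme):

    # Flag presentations whose free-text triage note contains an alcohol term.
    # The lexicon below is an illustrative assumption, not the study's.
    import re

    ALCOHOL_TERMS = re.compile(
        r"\b(etoh|alcohol|intox\w*|drunk|inebriat\w*)\b", re.IGNORECASE
    )

    def alcohol_involved(triage_text: str) -> bool:
        """True if the triage note mentions a known alcohol term."""
        return bool(ALCOHOL_TERMS.search(triage_text or ""))

    records = [
        {"id": 1, "triage": "Fell from skateboard, ETOH on board"},
        {"id": 2, "triage": "Laceration to hand from kitchen knife"},
    ]
    print([r["id"] for r in records if alcohol_involved(r["triage"])])  # [1]

As the conclusions note, such keyword matching inherits whatever inconsistency exists in the underlying documentation, which is why standardised capture at the point of care matters.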
Abstract:
Talk of Big Data seems to be everywhere. Indeed, the apparently value-free concept of ‘data’ has seen a spectacular broadening of popular interest, shifting from the dry terminology of labcoat-wearing scientists to the buzzword du jour of marketers. In the business world, data is increasingly framed as an economic asset of critical importance, a commodity on a par with scarce natural resources (Backaitis, 2012; Rotella, 2012). It is social media that has most visibly brought the Big Data moment to media and communication studies, and beyond it, to the social sciences and humanities. Social media data is one of the most important areas of the rapidly growing data market (Manovich, 2012; Steele, 2011). Massive valuations are attached to companies that directly collect and profit from social media data, such as Facebook and Twitter, as well as to resellers and analytics companies like Gnip and DataSift. The expectation attached to the business models of these companies is that their privileged access to data and the resulting valuable insights into the minds of consumers and voters will make them irreplaceable in the future. Analysts and consultants argue that advanced statistical techniques will allow the detection of ongoing communicative events (natural disasters, political uprisings) and the reliable prediction of future ones (electoral choices, consumption)...
Abstract:
Scrub typhus is a vector-borne disease carried by the chigger mite. The aetiological agent is the rickettsia Orientia tsutsugamushi, which is endemic to several countries in the Asia-Pacific region, including China [1]. It is also a travel-associated disease [2] and of great importance among military personnel [3], [4]. During the Second World War, scrub typhus was associated with a higher case fatality ratio than any other infectious disease in the China-Burma-India theatre of operations [1], [3]. Clinical presentation in patients varies from asymptomatic to life-threatening disease [5], including acute hearing loss and multiple organ failure [6], [7]. To date, there is still no effective and reliable human vaccine against scrub typhus and no point-of-care diagnostics available [1].
Abstract:
This study describes the evaluation of a clinical scar scale for our porcine burn scars, which includes scar cosmetic outcome, colour, height and hair, supplemented with reference porcine scar photographs representing each scar outcome and scar colour score. A total of 72 porcine burn scars at week 6 after burn were rated in vivo and/or on photographs. Good agreement was achieved for both intra-rater reliability (correlations 0.86-0.98) and inter-rater reliability (ICC = 80-85%). The results showed statistically significant correlations for each pair in this clinical scar scale (p<0.01), with the best correlation found between scar cosmetic outcome and scar colour. A multivariate principal components analysis revealed that this clinical scar assessment was highly correlated with scar histology, wound size, and re-epithelialisation data (p<0.001). More severe scars are clinically characterised by darker purple colouration, greater elevation and absence of hair, and histologically by thicker scar tissue and a thinner remaining normal dermis; they are also more likely to show worse contraction and slower re-epithelialisation. This study demonstrates that our clinical scar scale is a reliable, independent and valuable tool for assessing porcine burn outcome that faithfully reflects scar appearance and function. To our knowledge, this is the first study demonstrating a high correlation between clinical scar assessment and scar histology, wound contraction and re-epithelialisation data in porcine burn scars. We believe that the successful use of porcine scar scales is invaluable for assessing potential human burn treatments.
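For readers unfamiliar with the reliability statistics quoted, a minimal sketch of a two-way random-effects ICC(2,1) computation follows; the ratings matrix is invented for illustration and is not the study's scar data:

    # ICC(2,1): two-way random effects, absolute agreement, single rater
    # (Shrout & Fleiss). The ratings below are invented for illustration.
    import numpy as np

    def icc_2_1(x: np.ndarray) -> float:
        """x: (n subjects x k raters) ratings matrix."""
        n, k = x.shape
        grand = x.mean()
        row = x.mean(axis=1, keepdims=True)  # per-subject means
        col = x.mean(axis=0, keepdims=True)  # per-rater means
        msr = k * ((row - grand) ** 2).sum() / (n - 1)  # subjects mean square
        msc = n * ((col - grand) ** 2).sum() / (k - 1)  # raters mean square
        mse = ((x - row - col + grand) ** 2).sum() / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    ratings = np.array([[3, 3, 2], [1, 1, 1], [4, 3, 4], [2, 2, 3]], float)
    print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")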
Abstract:
Olfactory ensheathing cells (OECs) play an important role in the continuous regeneration of the primary olfactory nervous system throughout life and in the regeneration of olfactory neurons after injury. While it is known that several individual OEC subpopulations with distinct properties exist in different anatomical locations, it remains unclear how these subpopulations respond to a major injury. We have examined the proliferation of OECs from one distinct location, the peripheral accessory olfactory nervous system, following large-scale injury (bulbectomy) in mice. We used crosses of two transgenic reporter mouse lines, S100β-DsRed and OMP-ZsGreen, to visualise OECs and main/accessory olfactory neurons, respectively. We surgically removed one olfactory bulb, including the accessory olfactory bulb, to induce degeneration, and found that accessory OECs in the nerve bundles that terminate in the accessory olfactory bulb responded by increased proliferation, with a peak occurring 2 days after the injury. To label proliferating cells, we delivered the thymidine analogue ethynyl deoxyuridine (EdU) intranasally instead of by intraperitoneal injection. We compared and quantified the number of proliferating cells in different regions at one and four days after EdU labelling by the two methods and found that intranasal delivery was as effective as intraperitoneal injection. We demonstrated that accessory OECs actively respond to widespread degeneration of accessory olfactory axons by proliferating. These results have important implications for selecting the source of OECs for neural regeneration therapies and show that intranasal delivery of EdU is an efficient and reliable method for assessing proliferation of olfactory glia.
Abstract:
Most persistent organic pollutants (POPs), like polychlorinated biphenyls (PCBs), a range of polybrominated diphenyl ethers (PBDEs) and organochlorine pesticides (OCPs), are readily absorbed (via ingestion and inhalation) and accumulate in fatty tissue, including adipose tissue and human milk [1]. Health effects related to exposure to these chemicals may include neurological effects, altered functioning of the nervous system and/or endocrine disruption [2-4]. The burden of environmental disease is recognized as much higher for children than for adults, especially for young children under 5 years of age worldwide [5]. There is increased concern regarding the environmental impact on the health of children, who are disproportionately affected by environmental problems. For example, they may be subjected to relatively higher exposure, have greater physiological susceptibility and/or suffer more extreme consequences because they are still growing [6-9]. It is therefore worthwhile to assess the correlation between burden of disease and exposure to xenobiotic chemical pollutants like POPs. Such assessment may provide guidance for legislative changes regarding chemical bans and give reliable advice to parents, including lactating mothers.
Abstract:
MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of a MapReduce computation in cloud computing. From the computational point of view, the mapper/reducer placement problem is a generalisation of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mapper/reducer placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional placement that puts a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement, while still satisfying the computation deadline.
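The bin-packing framing can be made concrete with a classic first-fit-decreasing heuristic (shown below as a sketch of the problem family only; the paper's own heuristic and its cost model are more involved and are not reproduced here):

    # First-fit decreasing: place task demands onto identical machines.
    # Demands and capacity are normalised, invented values for illustration.
    def first_fit_decreasing(demands, capacity):
        """Return (task, machine) assignments and the number of machines used."""
        free = []        # remaining capacity of each opened machine
        assignment = []
        for i, d in sorted(enumerate(demands), key=lambda t: -t[1]):
            for m, f in enumerate(free):
                if d <= f:
                    free[m] -= d
                    assignment.append((i, m))
                    break
            else:                       # no opened machine fits: open a new one
                free.append(capacity - d)
                assignment.append((i, len(free) - 1))
        return assignment, len(free)

    tasks = [0.6, 0.5, 0.5, 0.4, 0.3, 0.2]  # normalised mapper/reducer demands
    print(first_fit_decreasing(tasks, capacity=1.0))  # uses 3 machines

A conventional placement that fixes the number of tasks per machine ignores this demand heterogeneity, which is the gap a placement heuristic targets.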
Abstract:
This work considers the problem of building high-fidelity 3D representations of the environment from sensor data acquired by mobile robots. Multi-sensor data fusion allows for more complete and accurate representations and for more reliable perception, especially when different sensing modalities are used. In this paper, we present a thorough experimental analysis of the performance of 3D surface reconstruction from laser and mm-wave radar data using Gaussian Process Implicit Surfaces (GPIS) in a realistic field robotics scenario. We first analyse the performance of GPIS using raw laser data alone and raw radar data alone, with different choices of covariance matrices and different resolutions of the input data. We then evaluate and compare the performance of two different GPIS fusion approaches. The first, state-of-the-art approach directly fuses raw data from laser and radar. The alternative approach proposed in this paper first computes an initial estimate of the surface from each single source of data, and then fuses these two estimates. We show that this method outperforms the state of the art, especially in situations where the sensors react differently to the targets they perceive.
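The alternative fusion approach can be sketched at the level of predictive distributions: if each sensor's GPIS yields a per-point mean and variance, the two estimates can be combined by a product of Gaussians, i.e. inverse-variance weighting (a simplified stand-in for the paper's method; the names and numbers below are illustrative):

    # Product-of-Gaussians fusion of two predictive estimates of a surface
    # point: the more confident (lower-variance) sensor dominates the result.
    import numpy as np

    def fuse(mu_a, var_a, mu_b, var_b):
        """Precision-weighted fusion of two Gaussian estimates."""
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        var = 1.0 / (w_a + w_b)
        return var * (w_a * mu_a + w_b * mu_b), var

    # e.g. a laser height estimate (low noise) and a radar one (higher noise)
    mu, var = fuse(np.array([1.02]), np.array([0.01]),
                   np.array([0.90]), np.array([0.09]))
    print(mu, var)  # fused mean sits close to the laser estimate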
Abstract:
This paper presents an approach to promote the integrity of perception systems for outdoor unmanned ground vehicles (UGV) operating in challenging environmental conditions (presence of dust or smoke). The proposed technique automatically evaluates the consistency of the data provided by two sensing modalities: a 2D laser range finder and a millimetre-wave radar, allowing for perceptual failure mitigation. Experimental results, obtained with a UGV operating in rural environments, and an error analysis validate the approach.
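One simple way to operationalise such a consistency check (purely illustrative; the paper's actual technique is not reproduced here) is to compare co-registered range returns per bearing and flag large disagreements, which in dust or smoke typically arise from the laser returning off airborne particles while the radar penetrates them:

    # Flag bearings where co-registered laser and radar ranges disagree.
    # The tolerance and the range values are illustrative assumptions.
    import numpy as np

    def inconsistent_bearings(laser_r, radar_r, tol=0.5):
        """Indices where the two sensors disagree by more than tol metres."""
        return np.flatnonzero(np.abs(laser_r - radar_r) > tol)

    laser = np.array([10.1, 9.8, 2.3, 10.0])  # 3rd return truncated by dust
    radar = np.array([10.0, 9.9, 9.7, 10.2])  # radar sees through the dust
    print(inconsistent_bearings(laser, radar))  # -> [2]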
Abstract:
This work aims to contribute to the reliability and integrity of perceptual systems of autonomous ground vehicles. Information-theoretic metrics for evaluating the quality of sensor data are proposed and applied to visual and infrared camera images. The contribution of the proposed metrics to the discrimination of challenging conditions is discussed and illustrated in the presence of airborne dust and smoke.
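One concrete metric in this spirit (an assumption for illustration; the paper's exact metrics are not reproduced here) is the Shannon entropy of an image's grey-level histogram, which tends to drop when airborne dust or smoke washes out scene detail:

    # Shannon entropy (bits) of an 8-bit image's grey-level distribution.
    # The images below are synthetic stand-ins for clear vs. smoky frames.
    import numpy as np

    def image_entropy(img: np.ndarray, bins: int = 256) -> float:
        hist, _ = np.histogram(img, bins=bins, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]                    # drop empty bins to avoid log2(0)
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(0)
    clear = rng.integers(0, 256, (64, 64))   # high-detail stand-in frame
    smoky = np.full((64, 64), 128)           # washed-out, uniform frame
    print(image_entropy(clear), image_entropy(smoky))  # high vs. 0.0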
Abstract:
This paper proposes an approach to obtain a localisation that is robust to smoke by exploiting multiple sensing modalities: visual and infrared (IR) cameras. The localisation is based on a state-of-the-art visual SLAM algorithm. First, we show that a reasonably accurate localisation can be obtained in the presence of smoke by using only an IR camera, a sensor that is hardly affected by smoke, unlike a visual camera operating in the visible spectrum. Second, we demonstrate that improved results can be obtained by combining the information from the two sensor modalities (visual and IR cameras). Third, we show that by detecting the impact of smoke on the visual images using a data quality metric, we can anticipate and mitigate the degradation in localisation performance by discarding the most affected data. The experimental validation presents multiple trajectories estimated by the various methods considered, all thoroughly compared to an accurate dGPS/INS reference.
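The third step can be sketched as a simple quality gate on the visual stream (the quality function and threshold below are illustrative assumptions; the paper's metric and SLAM pipeline are not reproduced here):

    # Gate visual frames on a data quality score; flagged frames would be
    # excluded from visual SLAM, leaving the IR stream to carry localisation.
    def gate_visual_frames(frames, quality_fn, threshold):
        """Return (frame, use_visual) pairs; use_visual is False when smoky."""
        return [(f, quality_fn(f) >= threshold) for f in frames]

    # toy usage: quality here is just a pre-computed score per frame
    frames = [{"id": 0, "q": 7.2}, {"id": 1, "q": 3.1}, {"id": 2, "q": 6.8}]
    print(gate_visual_frames(frames, lambda f: f["q"], threshold=6.0))
    # -> frame 1 (smoke-affected) is flagged for exclusion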