927 results for Effects-Based Operations
Abstract:
Hot metal carriers (HMCs) are large forklift-type vehicles used to move molten metal in aluminum smelters. This paper reports on field experiments that demonstrate that HMCs can operate autonomously and in particular can use vision as a primary sensor to locate the load of aluminum. We present our complete system but focus on the vision system elements and also detail experiments demonstrating reliable operation of the materials handling task. Two key experiments are described, lasting 2 and 5 h, in which the HMC traveled 15 km in total and handled the load 80 times.
Abstract:
The 21st century has brought profound changes to the space in which military action takes place. This shift, which now encompasses both the physical and the cognitive domains of military action, requires the adoption of new concepts of operation and more agile organizational structures in order to cope with a highly volatile, unpredictable and complex environment. Organizations are therefore more dependent than ever on information (and on the systems that generate it), and within military organizations one capability in particular has become central to their success: Intelligence, Surveillance & Reconnaissance (ISR). Given the complexity of the systems, processes and people involved in this capability, it is relevant to study how the Portuguese Air Force (Força Aérea Portuguesa, FAP) is accommodating the concept within its structure, since its adoption requires an information-age organization in which networking is particularly prominent. This research analyses contemporary forms of organizational structure, crosses them with the recommendations of the North Atlantic Treaty Organization (also referred to as the Alliance), and then compares them with the current state of the FAP. It concludes with tangible proposals that can enhance existing capabilities, notably the creation of an analysis matrix for organizational efficiency, a new way of organizing the resident ISR capabilities, and a means of strengthening networking based on existing assets.
Abstract:
Executive coaching is a rapidly expanding approach to leadership development which has grown at a rate that warrants extensive examination of its effects (Wasylyshyn, 2003). This thesis has therefore examined both behavioural and psychological effects based on a nine-month executive coaching intervention within a large not-for-profit organisation. The intervention was part of a larger ongoing integrated organisational strategy to create an organisational coaching culture. In order to examine the effectiveness of the nine-month executive coaching intervention, two studies were conducted. A quantitative study used pre- and post-intervention questionnaires to examine leaders' and their team members' responses before and after the coaching intervention. The research examined leader-empowering behaviours, psychological empowerment, job satisfaction and affective commitment. Significant results were demonstrated from leaders' self-reports on leader-empowering behaviours, and their team members' self-reports revealed a significant flow-on effect on psychological empowerment. The second part of the investigation involved a qualitative study which explored the developmental nature of psychological empowerment through executive coaching. The examination dissected psychological empowerment into its widely accepted four facets of meaning, impact, competency and self-determination and investigated, through semi-structured interviews, leaders' perspectives of the effect of executive coaching upon them (Spreitzer, 1992). It was discovered that a number of the common practices within executive coaching, including goal-setting, accountability and action-reflection, contributed to the production of outcomes that developed higher levels of psychological empowerment. Careful attention was also given to organisational context and its influence upon the outcomes.
Abstract:
Metallic materials exposed to oxygen-enriched atmospheres – as commonly used in the medical, aerospace, aviation and numerous chemical processing industries – represent a significant fire hazard which must be addressed during design, maintenance and operation. Hence, accurate knowledge of metallic material flammability is required. Reduced gravity (i.e. space-based) operations present additional unique concerns, where the absence of gravity must also be taken into account. The flammability of metallic materials has historically been quantified using three standardised test methods developed by NASA, ASTM and ISO. These tests typically involve the forceful (promoted) ignition of a test sample (typically a 3.2 mm diameter cylindrical rod) in pressurised oxygen. A test sample is defined as flammable when it undergoes burning that is independent of the ignition process utilised. In the standardised tests, this is indicated by the propagation of burning further than a defined amount, or "burn criterion". The burn criterion in use at the onset of this project was arbitrarily selected, and did not accurately reflect the length a sample must burn in order to be burning independently of the ignition event and, in some cases, required complete consumption of the test sample for a metallic material to be considered flammable. It has been demonstrated that a) a metallic material's propensity to support burning is altered by any increase in test sample temperature greater than ~250-300 °C and b) promoted ignition causes an increase in temperature of the test sample in the region closest to the igniter, a region referred to as the Heat Affected Zone (HAZ). If a test sample continues to burn past the HAZ (where the HAZ is defined as the region of the test sample above the igniter that undergoes an increase in temperature of greater than or equal to 250 °C by the end of the ignition event), it is burning independently of the igniter, and should be considered flammable. The extent of the HAZ, therefore, can be used to justify the selection of the burn criterion. A two-dimensional mathematical model was developed in order to predict the extent of the HAZ created in a standard test sample by a typical igniter. The model was validated against previous theoretical and experimental work performed in collaboration with NASA, and then used to predict the extent of the HAZ for different metallic materials in several configurations. The extent of the HAZ predicted varied significantly, ranging from ~2-27 mm depending on the test sample thermal properties and test conditions (i.e. pressure). The magnitude of the HAZ was found to increase with increasing thermal diffusivity and decreasing pressure (due to slower ignition times). Based upon the findings of this work, a new burn criterion requiring 30 mm of the test sample to be consumed (from the top of the ignition promoter) was recommended and validated. This new burn criterion was subsequently included in the latest revisions of the ASTM G124 and NASA 6001B international test standards that are used to evaluate metallic material flammability in oxygen. These revisions also have the added benefit of enabling reduced gravity metallic material flammability testing to be conducted in strict accordance with the ASTM G124 standard, allowing measurement and comparison of the relative flammability (i.e. Lowest Burn Pressure (LBP), Highest No-Burn Pressure (HNBP) and average Regression Rate of the Melting Interface (RRMI)) of metallic materials in normal and reduced gravity, as well as determination of the applicability of normal gravity test results to reduced gravity use environments. This is important, as currently most space-based applications will typically use normal gravity information in order to qualify systems and/or components for reduced gravity use. This is shown here to be non-conservative for metallic materials, which are more flammable in reduced gravity. The flammability of two metallic materials, Inconel® 718 and 316 stainless steel (both commonly used to manufacture components for oxygen service in both terrestrial and space-based systems), was evaluated in normal and reduced gravity using the new ASTM G124-10 test standard. This allowed direct comparison of the flammability of the two metallic materials in normal gravity and reduced gravity respectively. The results of this work clearly show, for the first time, that metallic materials are more flammable in reduced gravity than in normal gravity when testing is conducted as described in the ASTM G124-10 test standard. This was shown to be the case in terms of both higher regression rates (i.e. faster consumption of the test sample, the fuel) and burning at lower pressures in reduced gravity. Specifically, it was found that the LBP for 3.2 mm diameter Inconel® 718 and 316 stainless steel test samples decreased by 50% from 3.45 MPa (500 psia) in normal gravity to 1.72 MPa (250 psia) in reduced gravity for the Inconel® 718, and 25% from 3.45 MPa (500 psia) in normal gravity to 2.76 MPa (400 psia) in reduced gravity for the 316 stainless steel. The average RRMI increased by factors of 2.2 (27.2 mm/s in 2.24 MPa (325 psia) oxygen in reduced gravity compared to 12.8 mm/s in 4.48 MPa (650 psia) oxygen in normal gravity) for the Inconel® 718 and 1.6 (15.0 mm/s in 2.76 MPa (400 psia) oxygen in reduced gravity compared to 9.5 mm/s in 5.17 MPa (750 psia) oxygen in normal gravity) for the 316 stainless steel. Reasons for the increased flammability of metallic materials in reduced gravity compared to normal gravity are discussed, based upon the observations made during reduced gravity testing and previous work. Finally, the implications (for fire safety and engineering applications) of these results are presented and discussed, in particular examining methods for mitigating the risk of a fire in reduced gravity.
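As a rough illustration of why the HAZ (and hence the burn criterion) scales with thermal diffusivity and ignition time, a one-dimensional conduction estimate can be sketched; this is not the two-dimensional model developed in the project, and the constant-temperature boundary is an assumption made only for this sketch, while the 250 °C threshold is the one quoted above:

```latex
% One-dimensional sketch (not the project's 2-D model): a semi-infinite rod
% whose heated end is held at temperature T_s from time t = 0.
T(x,t) - T_\infty = (T_s - T_\infty)\,
  \operatorname{erfc}\!\left(\frac{x}{2\sqrt{\alpha t}}\right)
\quad\Longrightarrow\quad
x_{\mathrm{HAZ}} \sim 2\sqrt{\alpha\, t_{\mathrm{ign}}}\;
  \operatorname{erfc}^{-1}\!\left(\frac{250\ {}^{\circ}\mathrm{C}}{T_s - T_\infty}\right)
```

Under this simple scaling, a larger thermal diffusivity α or a longer ignition time t_ign (as observed at lower pressures) lengthens the heat affected zone, consistent with the ~2-27 mm range reported above.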
Abstract:
Performance-based planning is a form of planning regulation that is not well understood, and the theoretical advantages of this type of planning are rarely achieved in practice. Normatively, this type of regulation relies on performance standards that are quantifiable and technically based, designed to manage the effects of development, where performance standards provide certainty in respect of the level of performance and the means of achievement is flexible. Few empirical studies have attempted to examine how performance-based planning has been conceptualised and implemented in practice. Existing literature is predominantly anecdotal and consultant based (Baker et al. 2006) and has not sought to quantitatively examine how land use has been managed or determine how context influences implementation. The Integrated Planning Act 1997 (IPA) operated as Queensland's principal planning legislation between March 1998 and December 2009. The IPA prevented Local Governments from prohibiting development or use, and the term zone was absent from the legislation. While the IPA did not use the term performance-based planning, the system is widely considered to be performance based in practice (e.g. Baker et al. 2006; Steele 2009a, 2009b). However, the degree to which the IPA and the planning system in Queensland is performance based is debated (e.g. Yearbury 1998; England 2004). Four research questions guided the research framework using Queensland as the case study. The questions sought to: determine if there is a common understanding of performance-based planning; identify how performance-based planning was expressed under the IPA; understand how performance-based planning was implemented in plans; and explore the experiences of participants in the planning system. The research developed a performance adoption spectrum. The spectrum describes how performance-based planning is implemented, ranging between pure and hybrid interpretations. An ex-post evaluation of seventeen IPA plans sought to determine plan performativity within the conceptual spectrum. Land use was examined from the procedural dimension of performance (Assessment Tables) and the substantive dimension of performance (Codes). A documentary analysis and forty-one interviews supplemented the research. The analytical framework considered how context influenced performance-based planning, including whether: the location of the local government affected land use management techniques; temporal variation in implementation exists; plan-making guidelines affected implementation; different perceptions of the concept exist; and this type of planning applies to a range of spatial scales. Outcomes were viewed as the medium for determining the acceptability of development in Queensland, a significant departure from pure approaches found in the United States. Interviews highlighted the absence of plan-making direction in the IPA, which contributed to confusion about the intended direction of the planning system and the myth that the IPA would guarantee a performance-based system. A hybridised form of performance-based planning evolved in Queensland which was dependent on prescriptive land use zones and specification of land use type, with some local governments going to extreme lengths to discourage certain activities in a predetermined manner. Context had varying degrees of influence on plan-making methods.
Decision-making was found to be inconsistent, and the system created a range of unforeseen consequences, including difficulties associated with land valuation and increased development speculation; the role of planners in court was also found to be less critical than under the previous planning system.
Abstract:
Educational stratification has been a difficult subject to study, and no previous work has provided a quantitative measure of it. Using the idea of distribution comparison, a measure based on parents' education is built for the primary schools in Lisbon. Upon confirming that Lisbon is stratified, I use a measure of peer effects based on stratification and determine its impact on test scores, concluding that stratification improves the scores of students in schools with more educated parents and decreases the scores of students in schools with less educated parents. Moreover, using fixed effects, I conclude that the measure of peers' characteristics helps explain most of the differences among schools.
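A hypothetical sketch of one way such a fixed-effects check could look (not the thesis's actual specification; the data are simulated and all variable names are illustrative): estimate school fixed effects from student test scores, then see how much of the variation in those estimated school effects is accounted for by a school-level measure of peers' parental education.

```python
# Illustrative only: simulated students, schools and parental-education measure.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, per_school = 40, 50
school_id = np.repeat(np.arange(n_schools), per_school)
peer_edu_school = rng.normal(12.0, 2.0, n_schools)          # school mean of parents' years of schooling
peer_edu = peer_edu_school[school_id]
score = 45.0 + 2.5 * peer_edu + rng.normal(0.0, 8.0, school_id.size)  # simulated test scores

df = pd.DataFrame({"score": score, "school": school_id})

# Step 1: school fixed effects absorb all between-school differences in scores.
fe = smf.ols("score ~ C(school) - 1", data=df).fit()
school_effects = pd.DataFrame({"effect": fe.params.values,
                               "peer_edu": peer_edu_school})

# Step 2: regress the estimated school effects on the peer measure; the R^2
# indicates how much of the between-school variation the measure accounts for.
between = smf.ols("effect ~ peer_edu", data=school_effects).fit()
print(f"Share of between-school variation explained: R^2 = {between.rsquared:.2f}")
```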
Abstract:
Photoassociation is a possible route for the formation of chemical bonds. In this process, the binding of colliding atoms can be induced by means of a laser field. Photoassociation has been studied in the ultracold regime and also at temperatures well above millikelvins, in the thermal energy domain, which is a situation commonly encountered in the laboratory. A photoassociation mechanism can be envisioned based on the use of infrared pulses to drive a transition from free colliding atoms in the electronic ground state to a molecule formed directly in that state. This work takes a step in this direction, investigating the laser-pulse-driven formation of heteronuclear diatomic molecules in a thermal gas of atoms, including rotational effects. Based on the assumption of full system controllability, the maximum possible photoassociation yield is deduced. The photoassociation probability is calculated as a function of the laser parameters for different temperatures. Additionally, the photoassociation yield induced by subpicosecond pulses of a priori fixed shape is compared to the maximum possible yield.
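As a hedged illustration of the kind of bound implied by full controllability (the abstract does not restate the formula): for a thermal ensemble whose initial populations are Boltzmann weights, a unitary, fully controllable evolution can at best rearrange those populations, so the yield into a set of d target bound states cannot exceed the sum of the d largest initial weights:

```latex
% Illustrative bound only, assuming full unitary controllability of the dynamics.
p_j = \frac{e^{-E_j / k_B T}}{\sum_m e^{-E_m / k_B T}}, \qquad
Y_{\max}(T) \le \sum_{k=1}^{d} p_{(k)}
```

Here p_(1) ≥ p_(2) ≥ … are the initial state populations in decreasing order and d is the number of target molecular states; the bound therefore decreases as temperature spreads the initial population over more states.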
Abstract:
One of the challenges for structural engineers during design is considering how the structure will respond to crowd-induced dynamic loading. It has been shown that human occupants of a structure do not simply add mass to the system when considering the overall dynamic response of the system, but interact with it and may induce changes of the dynamic properties from those of the empty structure. This study presents an investigation into human-structure interaction based on several crowd characteristics and their effect on the dynamic properties of an empty structure. The dynamic properties, including frequency, damping, and mode shapes, were estimated for a single test structure by means of experimental modal analysis techniques. The same techniques were utilized to estimate the dynamic properties when the test structure was occupied by a crowd with different combinations of size, posture, and distribution. The goal of this study is to isolate the occupant characteristics in order to determine the significance of each to be considered when designing new structures to avoid crowd serviceability issues. The results are presented and summarized based on the level of influence of each characteristic. The posture that produces the most significant effects, based on the scope of this research, is standing with bent knees, with a maximum decrease in the frequency of the first mode of the empty structure of 32 percent at the highest mass ratio. The associated damping also increased to 36 times that of the empty structure. In addition to the analysis of the experimental data, finite element models and a two degree-of-freedom model were created. These models were used to gain an understanding of the test structure, model a crowd as an equivalent mass, and also to develop a single degree-of-freedom (SDOF) model to best represent a crowd of occupants based on the experimental results. The SDOF models created had an average frequency of 5.0 Hz, within the range presented in existing biomechanics research, and combined SDOF systems of the test structure and crowd were able to reproduce the frequency and damping ratios associated with experimental tests. Results of this study confirmed the existence of human-structure interaction and the inability to simply model a crowd as only additional mass. The two degree-of-freedom model determined was able to predict the change in natural frequency and damping ratio for a structure occupied by multiple group sizes in a single posture. These results and model are the preliminary steps in the development of an appropriate method for modeling a crowd in combination with a more complex FE model of the empty structure.
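To make the two degree-of-freedom idea concrete, here is a minimal numpy sketch in which the empty structure and the crowd are each represented as an SDOF system and the coupled modes are obtained from a state-space eigenanalysis. Only the roughly 5 Hz crowd frequency comes from the abstract; the masses, the structure frequency and both damping ratios are assumptions for illustration only.

```python
# Minimal 2-DOF crowd-structure sketch: structure SDOF with a crowd SDOF attached.
import numpy as np

f_s, zeta_s, m_s = 7.0, 0.005, 5000.0   # empty structure: frequency (Hz), damping ratio, mass (kg) (assumed)
f_c, zeta_c, m_c = 5.0, 0.30, 1500.0    # crowd SDOF: ~5 Hz per the abstract; damping and mass assumed

w_s, w_c = 2 * np.pi * f_s, 2 * np.pi * f_c
k_s, c_s = m_s * w_s**2, 2 * zeta_s * m_s * w_s
k_c, c_c = m_c * w_c**2, 2 * zeta_c * m_c * w_c

M = np.diag([m_s, m_c])
K = np.array([[k_s + k_c, -k_c], [-k_c, k_c]])
C = np.array([[c_s + c_c, -c_c], [-c_c, c_c]])

# First-order state-space form; complex eigenvalues give modal frequency and damping.
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
lam = np.linalg.eigvals(A)
lam = lam[lam.imag > 0]                  # one eigenvalue per underdamped mode
for l in sorted(lam, key=abs):
    f_n = abs(l) / (2 * np.pi)           # natural frequency (Hz)
    zeta = -l.real / abs(l)              # modal damping ratio
    print(f"mode: {f_n:5.2f} Hz, damping {zeta:5.1%}")
```

With these assumed numbers the lowest coupled mode sits below the empty-structure frequency and carries far more damping than the bare 0.5 percent, mirroring the qualitative trend reported above.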
Abstract:
Biological agents such as cytokines, monoclonal antibodies and fusion proteins are widely used in anti-inflammatory and tumour therapy. They are highly efficient in certain diseases, but can cause a great variety of adverse side-effects. Based on the peculiar features of biological agents, a new classification of their adverse side-effects is proposed, related to but clearly distinct from the classification of side-effects observed with chemicals and drugs. This classification differentiates five distinct types, namely clinical reactions because of high cytokine levels (type alpha), hypersensitivity because of an immune reaction against the biological agent (type beta), immune or cytokine imbalance syndromes (type gamma), symptoms because of cross-reactivity (type delta) and symptoms not directly affecting the immune system (type epsilon). This classification could help to better deal with the clinical features of these side-effects, to identify possible individual and general risk factors and to direct research in this novel area of medicine.
Abstract:
In efficiency studies using the stochastic frontier approach, the main focus is on explaining inefficiency in terms of exogenous variables and on computing the marginal effect of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are computed from the unconditional mean of inefficiency (Wang, 2002). In this paper we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
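For reference, in the widely used normal-half-normal specification (not restated in the abstract), with composed error ε_i = v_i − u_i, v_i ~ N(0, σ_v²) and u_i ~ N⁺(0, σ_u²), the Jondrow et al. (1982) estimator conditions inefficiency on the composed error:

```latex
% JLMS conditional mean of inefficiency, normal-half-normal case.
E[u_i \mid \varepsilon_i]
  = \mu_{*i} + \sigma_* \frac{\phi(\mu_{*i}/\sigma_*)}{\Phi(\mu_{*i}/\sigma_*)},
\qquad
\mu_{*i} = -\frac{\sigma_u^2 \varepsilon_i}{\sigma^2},\quad
\sigma_*^2 = \frac{\sigma_u^2 \sigma_v^2}{\sigma^2},\quad
\sigma^2 = \sigma_u^2 + \sigma_v^2
```

The marginal effects proposed in the paper are then obtained from this conditional mean with respect to the exogenous determinants of inefficiency, with bootstrap resampling providing the confidence intervals.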
Abstract:
The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity requiring enhanced Decision Support Tools (DST) to aid humans in the decision-making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study representing a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (a Demand and Capacity Balancing tool and an Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
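A minimal numpy-only sketch of the general PCE idea (not the thesis's arbitrary PCE or its trajectory predictor): fit a degree-2 Hermite expansion of a hypothetical prediction output in two standard-normal inputs by least squares, then read the mean, the variance and the first-order Sobol indices directly from the coefficients of the orthonormal basis. The predictor function and all numbers are placeholders.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

rng = np.random.default_rng(0)

def he_norm(x, k):
    """Normalized probabilists' Hermite polynomial He_k(x) / sqrt(k!)."""
    c = np.zeros(k + 1); c[k] = 1.0
    return hermeval(x, c) / np.sqrt(factorial(k))

# Hypothetical "trajectory predictor": flight time as a function of two
# standardized uncertain inputs (placeholder only, not the thesis's model).
def predictor(xi1, xi2):
    return 3600.0 + 40.0 * xi1 - 25.0 * xi2 + 6.0 * xi1 * xi2 + 3.0 * xi2**2

# Multi-indices of a total-degree-2 basis in two variables.
alphas = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]

# Sample standard-normal germs and fit the expansion by least squares.
n = 400
xi = rng.standard_normal((n, 2))
Psi = np.column_stack([he_norm(xi[:, 0], a) * he_norm(xi[:, 1], b) for a, b in alphas])
y = predictor(xi[:, 0], xi[:, 1])
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

mean = coef[0]
var = np.sum(coef[1:] ** 2)            # orthonormal basis: variance from coefficients
S1 = sum(c**2 for (a, b), c in zip(alphas, coef) if a > 0 and b == 0) / var
S2 = sum(c**2 for (a, b), c in zip(alphas, coef) if b > 0 and a == 0) / var
print(f"mean={mean:.1f} s  std={np.sqrt(var):.1f} s  S1={S1:.2f}  S2={S2:.2f}")
```

The aPCE approach in the thesis generalises this construction to arbitrary input distributions by building the orthogonal basis from the statistical moments of the inputs rather than assuming Gaussian germs.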
Abstract:
This two-storey office building and upper floor interior fit-out, completed for the 25th anniversary of Adelaide-based construction firm Badge Constructions, is a signature building for the client and its recently established Brisbane-based operations, and a showpiece for their commercial and industrial construction prowess and their dynamic, collaborative and transparent work ethic. Situated in the industrial precinct of Bulimba's Oxford Street, the building is a continuation of the street's nearby commercial heart, whilst its architectural language references the adjacent industrial structures. The building's shed-like skillion roof and western wall have been considered as a folded plane, allowing space to be considered as the inhabitation of the inner surface of this plane. The analogy of a lined garment, tailored to suit its wearer, clarifies the relationship between the western façade plane's unadorned, monochromatic outer surface and the coloured and patterned inner surface, celebrating inhabitation. Typically external construction materials are re-positioned as an integral part of the building's interior fit-out, alluding to Badge's construction repertoire and weakening traditional barriers between interior and exterior commercial space. In reference to its Queensland context, the external glazed line of the building is pulled back from the street, providing an eastern verandah edge and a northern court as part of the public realm. The upper floor office incorporates a cantilevered outdoor mezzanine within the northern court, whilst the adjacent reception area and stairwell utilise clear glazing in order to visually connect to the street. The building is designed to take advantage of natural light to the east, whilst shading habitable spaces from the north, a strategy that reduces solar heat gain and energy consumption. Placement of the building's amenities core to the west provides substantial bracing and allows maximum activation of the north and east street edges. A collaborative design process with an informed client and a consultant team of allied disciplines has resulted in an affordable commercial building with a high level of design resolution and a strong relationship to its Brisbane context, while also challenging the traditional relationships between exterior and interior commercial space.