905 results for positional advantage
Abstract:
Semantic perception and object labeling are key requirements for robots interacting with objects on a higher level. Symbolic annotation of objects allows the use of planning algorithms for object interaction, for instance in a typical fetch-and-carry scenario. In current research, perception is usually based on 3D scene reconstruction and geometric model matching, where trained features are matched with a 3D sample point cloud. In this work we propose a semantic perception method based on spatio-semantic features. These features are defined in a natural, symbolic way, in terms of properties such as geometry and spatial relations. In contrast to point-based model matching methods, a spatial ontology is used in which objects are described by what they "look like", similar to how a human would describe an unknown object to another person. A fuzzy reasoning approach matches perceivable features with a spatial ontology of the objects. The approach is able to deal with sensor noise and occlusions. Another advantage is that no training phase is needed to learn object features. The use case of the proposed method is the detection of soil sample containers in an outdoor environment, which have to be collected by a mobile robot. The approach is verified in real-world experiments.
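A minimal sketch of the kind of fuzzy matching this abstract describes: perceived geometric features of a point-cloud segment are scored against a symbolic ontology entry via membership functions and combined with a fuzzy AND. All feature names, membership shapes and values here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: feature names, membership functions and the
# ontology entry below are hypothetical stand-ins for the spatio-semantic
# features described in the abstract.

def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Symbolic ontology entry: a soil sample container described by how it
# "looks", e.g. a cylinder roughly 0.3 m tall standing on the ground.
ontology = {
    "sample_container": {
        "height_m":         lambda x: triangular(x, 0.2, 0.3, 0.4),
        "diameter_m":       lambda x: triangular(x, 0.1, 0.15, 0.2),
        "dist_to_ground_m": lambda x: triangular(x, -0.05, 0.0, 0.05),
    }
}

def match(perceived, entry):
    """Fuzzy conjunction (minimum) over all feature memberships."""
    return min(mf(perceived[name]) for name, mf in entry.items())

# A noisy segment extracted from a point cloud:
segment = {"height_m": 0.28, "diameter_m": 0.16, "dist_to_ground_m": 0.01}
score = match(segment, ontology["sample_container"])
print(f"membership 'sample_container': {score:.2f}")  # degree in [0, 1]
```

Because the ontology is symbolic, no training data are needed, and the graded membership degrades gracefully under sensor noise rather than failing a hard geometric threshold.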
Abstract:
For wind farm optimization with land belonging to different owners, the traditional penalty method is highly dependent on the type of wind farm land division, and its application can be cumbersome when the divisions are complex. To overcome this disadvantage, a new method is proposed in this paper for the first time. Unlike the penalty method, which requires adding a penalizing term when evaluating the fitness function, the new method repairs infeasible solutions before fitness evaluation. To assess its effectiveness for wind farm optimization, the results of the different methods are compared for three different types of wind farm division. Different wind scenarios are also incorporated during optimization: (i) constant wind speed and wind direction; (ii) varying wind speed and wind direction; and (iii) the more realistic Weibull distribution. Results show that the performance of the new method varies across land plots in the tested cases. Nevertheless, optimum or near-optimum results can be obtained with a sequential land plot study using the new method in all cases. It is concluded that satisfactory results can be achieved using the proposed method. In addition, it offers flexibility in managing the wind farm design: it not only frees users from defining a penalty parameter but also imposes no limitations on the wind farm division.
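A hedged sketch of the repair idea, under the simplifying assumption that land plots are axis-aligned rectangles (the paper handles more complex divisions): each infeasible turbine is moved to the nearest feasible point before the fitness function is called, so the fitness evaluation never needs a penalty term or a tuned penalty parameter.

```python
# Hypothetical repair operator for a layout-optimization loop.
import numpy as np

def repair(layout, plots):
    """Snap every turbine lying outside all plots onto the nearest plot point.

    layout : (n, 2) array of turbine (x, y) positions
    plots  : list of (xmin, xmax, ymin, ymax) rectangles -- a simple stand-in
             for the real, possibly complex, land divisions
    """
    repaired = layout.copy()
    for i, (x, y) in enumerate(layout):
        if any(x0 <= x <= x1 and y0 <= y <= y1 for x0, x1, y0, y1 in plots):
            continue  # already feasible
        # Clamp the position into each rectangle and keep the closest result.
        candidates = [(np.clip(x, x0, x1), np.clip(y, y0, y1))
                      for x0, x1, y0, y1 in plots]
        repaired[i] = min(candidates, key=lambda p: (p[0]-x)**2 + (p[1]-y)**2)
    return repaired

# Usage inside a genetic algorithm: repair first, then evaluate.
plots = [(0, 500, 0, 500), (700, 1200, 0, 500)]
layout = np.array([[250.0, 250.0], [650.0, 100.0]])  # second turbine infeasible
print(repair(layout, plots))  # -> [[250. 250.] [700. 100.]]
```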
Abstract:
We report a rapid and ultra-sensitive detection system for 2,4,6-trinitrotoluene (TNT) using unmodified gold nanoparticles and surface-enhanced Raman spectroscopy (SERS). First, a Meisenheimer complex is formed in aqueous solution between TNT and cysteamine within 15 min of mixing. Complex formation is confirmed by the development of a pink colour and a new UV–vis absorption band around 520 nm. Second, the Meisenheimer complex spontaneously self-assembles onto unmodified gold nanoparticles through a stable Au–S bond between the cysteamine moiety and the gold surface. The resulting cysteamine–TNT monolayer is then screened by SERS to detect and quantify TNT. Our experimental results demonstrate that the SERS-based assay provides an ultra-sensitive approach for the detection of TNT down to 22.7 ng/L. The unambiguous fingerprint identification of TNT by SERS is a key advantage of the proposed method, which is also highly selective for TNT over 2,4-DNT and picric acid. It therefore satisfies the practical requirements for rapid screening of TNT in real-life samples, where the interim 24-h average allowable concentration of TNT in waste water is 0.04 mg/L.
Abstract:
This paper describes a concept for supporting distributed hands-on collaboration through interaction design for the physical and the digital workspace. The Blended Interaction Spaces concept creates distributed work environments in which all collaborating parties feel that they are present "here" rather than "there". We describe the thinking and inspirations behind the Blended Interaction Spaces concept, and summarize findings from the fieldwork activities informing our design. We then exemplify the Blended Interaction Spaces concept through a prototype implementation of one of four concepts.
Abstract:
By taking advantage of the excellent mechanical properties and high specific surface area of graphene oxide (GO) sheets, we develop a simple and effective strategy to improve the interlaminar mechanical properties of carbon fiber reinforced plastic (CFRP) laminates. By incorporating a graphene oxide reinforced epoxy interleaf at the interface of CFRP laminates, the Mode-I fracture toughness and resistance were greatly increased. Double cantilever beam (DCB) tests demonstrated that, with a GO addition of 2 g/m2, the Mode-I fracture toughness and resistance of the specimen increase by 170.8% and 108.0%, respectively, compared to those of the plain specimen. The improvement mechanisms were investigated by observing the fracture surfaces with scanning electron microscopy. Moreover, finite element analyses based on a cohesive zone model were performed to verify the experimental fracture toughness and to predict the interfacial tensile strength of the CFRP laminates.
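For readers unfamiliar with cohesive zone models, the following is an illustrative bilinear traction-separation law of the kind such analyses typically use; the stiffness, strength and toughness values are placeholders, not the paper's calibrated parameters.

```python
# Illustrative bilinear cohesive law: linear loading up to the interfacial
# strength t_max, then linear softening, with the area under the curve equal
# to the Mode-I fracture toughness G_Ic. All parameter values are placeholders.
import numpy as np

def bilinear_traction(delta, K=1e6, t_max=30.0, G_Ic=0.6):
    """Traction (MPa) vs. opening delta (mm).

    K     : initial penalty stiffness (MPa/mm)
    t_max : interfacial tensile strength (MPa)
    G_Ic  : Mode-I fracture toughness (N/mm), the area under the curve
    """
    d0 = t_max / K            # opening at damage initiation
    df = 2.0 * G_Ic / t_max   # opening at complete separation
    if delta <= d0:
        return K * delta
    if delta >= df:
        return 0.0
    return t_max * (df - delta) / (df - d0)

# Sanity check: numerically recover G_Ic as the area under the curve.
deltas = np.linspace(0.0, 0.05, 2000)
area = sum(bilinear_traction(d) for d in deltas) * (deltas[1] - deltas[0])
print(f"area under curve ~ {area:.3f} N/mm (target G_Ic = 0.6)")
```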
Abstract:
Movement of malaria across international borders poses a major obstacle to achieving malaria elimination in the 34 countries that have committed to this goal. In border areas, malaria prevalence is often higher than elsewhere due to lower access to health services, the treatment-seeking behaviour of the marginalised populations that typically inhabit border areas, difficulties in deploying prevention programs to hard-to-reach communities, often in difficult terrain, and the constant movement of people across porous national boundaries. Malaria elimination in border areas will be challenging, and key to addressing the challenges is strengthening surveillance activities for rapid identification of any importation or reintroduction of malaria. This could involve taking advantage of technological advances such as spatial decision support systems, which can be deployed to assist program managers in carrying out preventive and reactive measures, and mobile phone technology, which can be used to capture the movement of people in border areas and likely sources of malaria importation. Additionally, joint collaboration between neighbouring countries in the prevention and control of cross-border malaria, and reinforcement of early diagnosis and prompt treatment, are ways forward in addressing the problem of cross-border malaria.
Abstract:
The comments I make are based on my nearly twenty years' involvement in the dementia cause at both a national and international level. In preparation, I read two papers, namely the Ministerial Dementia Forum – Option Paper produced by KPMG Management Consultants (2014) and Analysis of Dementia Programmes and Services Funded by the Department of Social Services: Conversation Starter, prepared by KPMG for those attending a workshop in Brisbane on April 22nd 2015. Dementia is a complex "syndrome" and, as is often said, "when you meet one person with dementia, you have met one", meaning that no two persons with dementia are the same. Even in dementia care, Australia is a "lucky country", and there is much to be said for the quality and diversity of dementia care available for people living with dementia. Despite this, I agree with the many views expressed in the material I read that there is scope for improvement, especially in the way that services are coordinated. In saying that, I do not purport to have all the solutions, nor claim to have the knowledge required to comment on all the programs covered by this review. If I appear to be a "biased" advocate for Alzheimer's Australia across the States and Territories, it is because I have seen constant evidence of ordinary people doing extraordinary things with inadequate resources. Dementia care is not cheap, and if those funding dementia services are primarily interested only in economic outcomes and benefits, the real purpose of this consultation will be defeated. In addition, nowhere in the material I have read is there any recognition that in many instances program funding is a complex mix of government (at all levels) and private funding. This makes reviewing those programs more complex and less able to be coordinated at a Departmental level. It goes without saying, therefore, that the Federal Government is not "the only player in this game". Of all those participating in this review, Alzheimer's Australia is best placed to comment on programs, as it is most connected to people living with dementia and probably has the best record of consulting with them. It would appear, however, that its role has been reduced to that of a "bit player". Without wanting to be critical, of the 70 individuals and organisations whose comments the Forum Report deals with, only three (3), or 4.28%, were actual carers of people living with dementia. Even if it is argued that a number of the organisations present represented consumers, the percentage rises only marginally, to 8.57%, which is hardly an endorsement of the forum being "consumer driven". Most of those present were service providers, each with their own agenda and each seeking advantage for their "business". The final point I want to make, before commenting on more specific, program-related issues, is that many of the programs being reviewed have a much longer history than is reflected in the material I have read. Their growth and development were pioneered by Alzheimer's Australia organisations across the country, often with no government funding. Attempts to bring about better coordination of programs were often at the behest of Alzheimer's Australia but in the main were ignored. The opportunity to now put this right is long overdue.
Abstract:
India's desire to transform itself into an international military power has brought about a rapid shift in its approach to procuring military hardware. The indigenization of India's military manufacturing capacity forms an integral part of the strategic objectives of the Indian military services, and its realization is a function of significant government investment in strategic technologies. This has a number of ramifications. An indigenous Indian military capacity, particularly in the field of aviation, forms a key part of India's ambition to achieve regional air superiority, or even supremacy, and to be capable of power projection. This is particularly a response to China's increasing presence in South Asian airspace. A burgeoning Indian military manufacturing machine, based on a comparative advantage in skilled technicians and lower-cost labour together with strategic collaboration with foreign military hardware manufacturers, may also lead neighbouring countries to look to India as a source of competitively priced military hardware. In short, this chapter seeks to analyse the rationale behind India's attempt to become militarily self-sufficient in the field of aviation, discuss the technical, economic and political context in which it is pursuing this transformation, and assess the outlook for success of India's drive to achieve self-sufficiency in military aviation. It does so using the case of India's attempt to develop a fifth-generation fighter aircraft.
Abstract:
In this paper, we assess whether quality survives the test of time in academia by comparing up to 80 years of academic journal article citations from two top journals, Econometrica and the American Economic Review. The research setting under analysis is analogous to a controlled real-world experiment in that it involves a homogeneous task (trying to publish in top journals) performed by individuals with a homogeneous job profile (academics) in a specific research environment (economics and econometrics). Comparing articles published concurrently in the same outlet (same issue) indicates that symbolic capital or power due to institutional affiliation or connection does seem to boost citation success at the beginning, giving those educated at or affiliated with leading universities an initial comparative advantage. Such advantage, however, does not hold in the long run: at a later stage, the publications of other researchers become as or even more successful.
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technology. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems are largely computerised, we still undertake commissioning testing using the same philosophy, as if each signal were hard-wired. This is slow and tedious and does not do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it "abstracts" the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data this way, rather than by the traditional position-based method (as in DNP 3.0), is that generic tools can be created to manipulate data: self-describing data contains the information these tools need to manipulate different data types correctly and, more importantly, makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for handling this data within a tightly bound and compliant IEC 61850 Substation Automation System could completely change how systems need to be tested compared to traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. The tool is implemented in a Java programming environment using an open-source IEC 61850 library to facilitate the server-client association with the relay.
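A hedged sketch of the comparison step. The thesis tool itself is written in Java and reads the live configuration from the relay over the network; here, to show only the diff logic, both sides are assumed to have already been serialised to SCL, and the file names are placeholders.

```python
# Compare the data sets declared in a relay's CID file against those read
# back from the relay (assumed here to be serialised to SCL as well).
import xml.etree.ElementTree as ET

SCL_NS = "{http://www.iec.ch/61850/2003/SCL}"

def datasets(scl_path):
    """Map each DataSet name to the frozen set of its FCDA member references."""
    root = ET.parse(scl_path).getroot()
    result = {}
    for ds in root.iter(f"{SCL_NS}DataSet"):
        members = frozenset(
            tuple(fcda.get(a, "") for a in
                  ("ldInst", "prefix", "lnClass", "lnInst",
                   "doName", "daName", "fc"))
            for fcda in ds.iter(f"{SCL_NS}FCDA"))
        result[ds.get("name")] = members
    return result

def compare(cid_path, relay_path):
    """Report data sets that are missing, unexpected, or differ in membership."""
    expected, actual = datasets(cid_path), datasets(relay_path)
    for name in sorted(expected.keys() | actual.keys()):
        if name not in actual:
            print(f"MISSING on relay: {name}")
        elif name not in expected:
            print(f"UNEXPECTED on relay: {name}")
        elif expected[name] != actual[name]:
            print(f"MISMATCH in {name}: {expected[name] ^ actual[name]}")
        else:
            print(f"OK: {name}")

compare("relay.cid", "relay_online.scl")  # file names are placeholders
```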
Abstract:
In estuaries and natural water channels, estimates of velocity and dispersion coefficients are critical to the knowledge of scalar transport and mixing. Such estimates are rarely available experimentally at sub-tidal time scales in shallow water channels, where high-frequency sampling is required to capture their spatio-temporal variation. This study estimates Lagrangian integral scales and autocorrelation curves, which are key parameters for obtaining velocity fluctuations and dispersion coefficients, and their spatio-temporal variability from deployments of Lagrangian drifters sampled at 10 Hz over a 4-hour period. The power spectral densities of the velocities between 0.0001 and 0.8 Hz were well fitted by the −5/3 slope predicted by Kolmogorov's similarity hypothesis within the inertial subrange, and were similar to the Eulerian power spectra previously observed within the estuary. The results showed that large velocity fluctuations determine the magnitude of the integral time scale, TL. Overlapping short segments improved the stability of the TL estimate by taking advantage of the redundant data included in the autocorrelation function. The integral time scales were about 20 s and varied by up to a factor of 8. These results are essential inputs for spatial binning of velocities, Lagrangian stochastic modelling and single-particle analysis of the tidal estuary.
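A minimal sketch of the estimate described above: compute the autocorrelation of velocity fluctuations per segment, integrate each curve up to its first zero crossing, and average over overlapping segments to stabilise TL. The segment length and 50% overlap are illustrative choices, not the study's values, and the data here are synthetic.

```python
import numpy as np

def autocorr(u):
    """Biased autocorrelation of the velocity fluctuation series u."""
    u = u - u.mean()
    r = np.correlate(u, u, mode="full")[len(u) - 1:]
    return r / r[0]

def integral_time_scale(u, fs, seg_len, overlap=0.5):
    """Mean TL (s) over overlapping segments, integrating each
    autocorrelation curve up to its first zero crossing."""
    step = int(seg_len * (1.0 - overlap))
    scales = []
    for start in range(0, len(u) - seg_len + 1, step):
        r = autocorr(u[start:start + seg_len])
        zero = np.argmax(r <= 0) if (r <= 0).any() else len(r)
        scales.append(r[:zero].sum() / fs)   # rectangle-rule integral
    return float(np.mean(scales))

# One hour of synthetic 10 Hz velocities with a ~20 s decorrelation time,
# mimicking a drifter record (AR(1) process).
rng = np.random.default_rng(0)
fs, n = 10.0, 36000
u = np.zeros(n)
for i in range(1, n):
    u[i] = np.exp(-1.0 / (fs * 20.0)) * u[i - 1] + rng.normal()
print(f"TL ~ {integral_time_scale(u, fs, seg_len=6000):.1f} s")
```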
Abstract:
Background: Prediction of outcome after stroke is important for triage decisions, prognostic estimates for family and appropriate resource utilization. Prognostication must be timely and simple to apply. Several scales have shown good prognostic value. In Calgary, the Orpington Prognostic Score (OPS) has been used to predict outcome as an aid to rehabilitation triage; however, the OPS has not been assessed at one week for predictive capability. Methods: Among patients admitted to a sub-acute stroke unit, first-week OPS scores were examined to determine whether they correlated with final disposition after rehabilitation. The predictive validity of the OPS at one week was compared to the National Institutes of Health Stroke Scale (NIHSS) score at 24 hours using logistic regression and receiver operating characteristic analysis. The primary outcome was final disposition, whether the patient went directly home from the stroke unit, died, or was discharged from the inpatient rehabilitation unit. Results: The first-week OPS was highly predictive of final disposition. However, no major advantage of the first-week OPS over the 24-hour NIHSS score was observed; both scales were equally predictive of the final, post-rehabilitation disposition of stroke patients. Conclusion: The first-week OPS can be used to predict final outcome, but the NIHSS at 24 hours provides the same prognostic information.
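An illustrative version of the comparison described in the Methods, on synthetic data rather than the study's patient records: one logistic model per scale, compared by area under the ROC curve for the binary outcome of going directly home. The score ranges and the coefficients generating the synthetic outcome are assumptions.

```python
# Synthetic comparison of two prognostic scales via logistic regression + ROC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 200
ops = rng.uniform(1.6, 6.8, n)       # week-1 OPS (lower = better prognosis)
nihss = rng.integers(0, 30, n)       # 24 h NIHSS (lower = better prognosis)
# Synthetic outcome loosely driven by both scales (coefficients assumed):
p_home = 1.0 / (1.0 + np.exp(-(6.0 - ops - 0.15 * nihss)))
home = rng.random(n) < p_home        # True = discharged directly home

for name, score in [("OPS (week 1)", ops), ("NIHSS (24 h)", nihss)]:
    X = score.reshape(-1, 1)
    model = LogisticRegression().fit(X, home)
    auc = roc_auc_score(home, model.predict_proba(X)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```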
Abstract:
This paper asks at what scale, and how fast, society needs to reduce its ecological footprint and improve resource productivity to prevent further overshoot and return within the limits of the earth's ecological life-support systems. The paper shows that a large range of studies now find that engineering sustainable solutions requires roughly an order-of-magnitude improvement in resource productivity (sometimes called Factor 10, or a 90% reduction) by 2050 to achieve real and lasting ecological sustainability. This marks a significant challenge for engineers, and indeed all designers and architects: best practice in engineering sustainable solutions will need to achieve large resource productivity targets. The paper brings together examples of best practice in achieving these large targets from around the world, and highlights key resources and texts for engineers who wish to learn how to do it. But engineers need to be realistic and patient. Significant barriers to achieving Factor 4-10 exist, such as the fact that infrastructure and technology rollover and replacement are often slow. This slow rollover of the built environment and technology is the context within which most engineers work, making the goal of achieving Factor 10 all the more challenging. However, the paper demonstrates that by using best practice in engineering sustainable solutions, and by addressing the necessary market, information and institutional failures, it is possible to achieve Factor 10 over the next 50 years. This paper draws on recent publications by The Natural Edge Project (TNEP) and partners, including Hargroves, K. and Smith, M. (eds) (2005) The Natural Advantage of Nations: Business Opportunities, Innovation and Governance for the 21st Century, and the TNEP Engineering Sustainable Solutions Program - Critical Literacies for Engineers Portfolio. Both projects have the significant support of Engineers Australia, its College of Environmental Engineers and the Society of Sustainability and Environmental Engineering.
Abstract:
The last three decades have been difficult for companies and industry: an increasingly competitive international business climate, shifting national environmental regulations, higher standards demanded by consumers and community groups, not to mention the escalating cost of primary resources such as water, steel and minerals. Compounding these pressures is the traditional notion, held by business executives and engineers, that there is an inherent trade-off between eco-efficiency and improving the economic bottom line. However, there is significant evidence, and there are examples of best practice, showing that there is in fact no trade-off between the environment and the economy if sustainable development through continual improvement is adopted. It is therefore highly possible for companies to make a profitable transition towards sustainable business practice, taking advantage of significant business opportunities along the way. Companies are by their very nature dynamic, influential and highly capable of adapting to change. An organisational transformation to a sustainable business is not beyond the capacity of the typical company, which already knows much of what is needed to change its activities to satisfy current market demands while remaining competitive. To make the transition towards sustainable business practice, however, companies require some key mechanisms: accurate information on methodologies and opportunities, understanding of the financial and non-financial incentives, permission from stakeholders and shareholders, understanding of the emerging market opportunities, a critical mass of leaders in their sector and demonstrated case studies, and rewarding of appropriate risk-taking by engineers and CEOs. Satisfying these requirements will foster an innovative culture within the company, one that strives for continual improvement and successfully transforms the business to achieve competitiveness in the 21st century. This paper summarises the experiences of The Natural Edge Project (TNEP) and its partners in assisting organisations to make a profitable transition towards sustainable business practice through several initiatives. The Natural Advantage of Nations publication provides the critical information business leaders and engineers require to set the context of sustainable business practice. The Profiting in a Carbon Constrained World report, developed with Natural Capitalism Inc led by Hunter Lovins, summarises the opportunities available to companies through carbon trading mechanisms such as the Chicago Climate Exchange and the European Climate Exchange. The Sustainability Helix then guides the company through the transition by identifying the key tools and methodologies required to reduce environmental loading while dramatically improving resource productivity and achieving competitiveness. Finally, the Engineering Sustainable Solutions Program delivers the key engineering information required by companies and university departments to deliver sustainable engineering solutions. The initiatives vary in complexity and level of application; however, all are designed to give key staff the critical information required to make a profitable transition towards sustainable business practice. It is then their responsibility to apply and teach their knowledge across the rest of the organisation.
Abstract:
We used event-related fMRI to investigate the neural correlates of encoding strength and word frequency effects in recognition memory. At test, participants made Old/New decisions to intermixed low (LF) and high frequency (HF) words that had been presented once or twice at study and to new, unstudied words. The Old/New effect for all hits vs. correctly rejected unstudied words was associated with differential activity in multiple cortical regions, including the anterior medial temporal lobe (MTL), hippocampus, left lateral parietal cortex and anterior left inferior prefrontal cortex (LIPC). Items repeated at study had superior hit rates (HR) compared to items presented once and were associated with reduced activity in the right anterior MTL. By contrast, other regions that had shown conventional Old/New effects did not demonstrate modulation according to memory strength. A mirror effect for word frequency was demonstrated, with the LF word HR advantage associated with increased activity in the left lateral temporal cortex. However, none of the regions that had demonstrated Old/New item retrieval effects showed modulation according to word frequency. These findings are interpreted as supporting single-process memory models proposing a unitary strength-like memory signal and models attributing the LF word HR advantage to the greater lexico-semantic context-noise associated with HF words due to their being experienced in many pre-experimental contexts.