27 results for Electronics in military engineering.


Relevance:

100.00%

Publisher:

Abstract:

Causing civilian casualties during military operations has become a highly politicised topic in international relations since the Second World War. Since the last decade of the 20th century, scholars and political analysts have claimed that human life is increasingly valued across the international community. This argument has led many researchers to assume that democratic culture and traditions, together with modern ethical and moral concerns, have created a desire for a world without war or, at least, a demand that contemporary armed conflicts, if unavoidable, be far less lethal, forcing the military to seek new technologies that can minimise civilian casualties and collateral damage. Non-Lethal Weapons (NLW), weapons intended to minimise civilian casualties and collateral damage, are based on technology that, during the 1990s, was expected to revolutionise the conduct of warfare by making it significantly less deadly. The rapid rise of interest in NLW, ignited by the American military twenty-five years ago, sparked an entirely new military and academic discourse concerning their potential contribution to military success on 21st-century battlefields. It seems, however, that apart from this debate, very little has been done within the military forces themselves. This research suggests that the roots of this situation lie much deeper than simple professional misconduct by the military establishment, or poor behaviour by the political leaders who sent them to fight. Following the story of NLW in the U.S., Russia and Israel, this research focuses on the political and cultural factors that were supposed to force the military organisations of these countries to adopt new technologies and new operational and organisational concepts regarding NLW in an attempt to minimise enemy civilian casualties during their military operations.
This research finds that while the American, Russian and Israeli national characters are undoubtedly products of each nation's unique historical experience, all three pay very little regard to foreigners' lives. Moreover, while it is generally argued that international political pressure is a crucial factor in significantly reducing harm to civilians and civilian infrastructure, the findings of this research suggest that the American, Russian and Israeli governments are well prepared and politically equipped to fend off international criticism. As the analyses of the American, Russian and Israeli cases reveal, the political-military leaderships of these countries have very few external or domestic reasons to minimise enemy civilian casualties through a fundamental, revolutionary change in their conduct of war. In other words, this research finds that the employment of NLW has failed because the political leadership asks the militaries to reduce enemy civilian casualties to a politically acceptable level, rather than to the technologically possible minimum; in the socio-cultural-political context of each country, support for the former appears to be significantly higher than for the latter.

Relevance:

100.00%

Publisher:

Abstract:

Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals and expectations. Objectives: Current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. Users' requirements can be represented as a case in a defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, suitable modelling methods to achieve this end are lacking. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulation, mapping, configuration and learning content. The results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques supporting functions for profiling users' requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling individual users' needs and discovering users' requirements.
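The abstract does not detail how a requirements case is matched to a provision specification; purely as an illustrative sketch (all attribute names, values and the scoring rule below are hypothetical, a crude stand-in for the paper's ontological reasoning), a case can be treated as attribute-value pairs scored against candidate specifications:

```python
def match_specification(user_case, specs):
    """Pick the information-provision spec that best overlaps a user's
    requirements case, by counting matching attribute-value pairs."""
    def score(spec):
        return sum(1 for key, value in user_case.items()
                   if spec.get(key) == value)
    return max(specs, key=score)

# A user's requirements captured as a simple case (hypothetical attributes).
user_case = {"goal": "learn-modelling", "level": "beginner", "format": "video"}

specs = [
    {"goal": "learn-modelling", "level": "advanced", "format": "text",
     "content": "formal ontology languages"},
    {"goal": "learn-modelling", "level": "beginner", "format": "video",
     "content": "introductory modelling course"},
]

# The beginner video spec matches on all three attributes and wins.
print(match_specification(user_case, specs)["content"])
```

The real method reasons over an ontology rather than flat dictionaries, but the end product is the same: a specification selected so that the right content reaches the right user.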

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a parallel genetic algorithm for the Steiner Problem in Networks (SPN). Several previous papers have proposed adopting GAs and other metaheuristics to solve the SPN, demonstrating the validity of their approaches. This work differs from them for two main reasons: the dimension and characteristics of the networks adopted in the experiments, and the aim from which it originated, namely to build a comparison baseline for validating deterministic and computationally inexpensive algorithms that can be used in practical engineering applications, such as multicast transmission in the Internet. Moreover, the large dimensions of our sample networks require a parallel implementation of the Steiner GA that is able to deal with such large problem instances.
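The abstract does not give the paper's GA encoding; one common formulation (assumed here, not taken from the paper) uses one bit per candidate Steiner vertex and scores a chromosome by the weight of a minimum spanning tree over the terminals plus the selected vertices:

```python
def mst_weight(vertices, edges):
    """Weight of a minimum spanning tree over `vertices` (Kruskal with
    union-find); returns None if the induced subgraph is disconnected."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    total, joined = 0, 0
    for w, u, v in sorted((w, u, v) for (u, v), w in edges.items()
                          if u in parent and v in parent):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            total += w
            joined += 1
    return total if joined == len(vertices) - 1 else None

def steiner_fitness(bits, steiner_nodes, terminals, edges):
    """Each bit switches one candidate Steiner vertex on; fitness is the
    MST weight of terminals + selected vertices (lower is better), with
    an infinite penalty for disconnected selections."""
    chosen = {v for b, v in zip(bits, steiner_nodes) if b}
    weight = mst_weight(set(terminals) | chosen, edges)
    return weight if weight is not None else float("inf")

# Toy network: terminals a, c, d and one candidate hub b.
edges = {("a", "b"): 1, ("b", "c"): 1, ("b", "d"): 1,
         ("a", "c"): 3, ("a", "d"): 3, ("c", "d"): 3}
print(steiner_fitness([1], ["b"], ["a", "c", "d"], edges))  # via the hub: 3
print(steiner_fitness([0], ["b"], ["a", "c", "d"], edges))  # terminals only: 6
```

A GA would evolve a population of such bit strings with selection, crossover and mutation; the parallelism in the paper comes from evaluating these fitness calls, which dominate the runtime on large networks, across processors.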

Relevance:

100.00%

Publisher:

Abstract:

Modern buildings are designed to enhance the match between the environment, spaces and the people carrying out work, so that the well-being and performance of the occupants are in harmony. Building services are systems that facilitate a healthy working environment within which workers' productivity can be optimised. However, the maintenance of these services is fraught with problems and may contribute up to 50% of the total life cycle cost of the building. Maintenance support is one area which is not usually designed into the system, as this is not common practice in the services industry. Other areas of shortfall for future designs are client requirements, commissioning, facilities management data and post-occupancy evaluation feedback, which need to be adequately planned so that this information is captured and documented for use in future designs. At the University of Reading, an integrated approach has been developed to assemble the multitude of aspects inherent in this field. It records required and measured achievements for the benefit of both building owners and practitioners. This integrated approach can be represented in a Through Life Business Model (TLBM) format using the concept of Integrated Logistic Support (ILS). The prototype TLBM utilises the tailored tools and techniques of ILS for building services. This TLBM approach will facilitate the development of a databank that would be invaluable in capturing essential data (e.g. reliability of components) for enhancing future building services designs, life cycle costing and decision making by practitioners, in particular facilities managers.
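To put the 50% figure in perspective, a simple net-present-value life cycle costing with entirely hypothetical numbers (the figures below are illustrative, not from the paper) shows how recurring maintenance can rival the initial capital outlay over a building's service life:

```python
def life_cycle_cost(capital, annual_maintenance, years, discount_rate):
    """Net-present-value life cycle cost: initial capital plus annual
    maintenance discounted over the building's service life."""
    pv_maintenance = sum(annual_maintenance / (1 + discount_rate) ** t
                         for t in range(1, years + 1))
    return capital + pv_maintenance, pv_maintenance

# Hypothetical: 1.0M capital, 80k/year maintenance, 25 years, 5% discount rate.
total, pv_maintenance = life_cycle_cost(1_000_000, 80_000, 25, 0.05)
# Maintenance exceeds half of the total life cycle cost in this scenario.
print(f"maintenance share of life cycle cost: {pv_maintenance / total:.0%}")
```

A databank of component reliability data, as the TLBM envisages, would feed the maintenance and replacement inputs of exactly this kind of calculation.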

Relevance:

100.00%

Publisher:

Abstract:

The importance of biological materials has long been recognised, from the molecular level to higher levels of organisation. Whereas in traditional engineering hardness and stiffness are considered desirable properties in a material, biology makes considerable and advantageous use of softer, more pliable resources. The development, structure and mechanics of these materials are well documented and will not be covered here. The purpose of this paper is, rather, to demonstrate the importance of such materials and, in particular, of the functional structures they form. Using only a few simple building blocks, nature is able to develop a plethora of diverse materials, each with a very different set of mechanical properties, from which a seemingly impossibly large number of assorted structures is formed. There is little doubt that this is made possible by the fact that the majority of biological 'materials' or 'structures' are based on fibres, and that these fibres provide opportunities for functional hierarchies. We show how these structures have inspired a new generation of innovative technologies in the science and engineering community. Particular attention is given to the use of insects as models for biomimetically inspired innovations.

Relevance:

100.00%

Publisher:

Abstract:

The UK construction industry has been criticised for being slow to adopt process innovations. Research shows that the idiosyncrasies of participants, their roles in the system and the contextual differences between sections of the industry make this a highly complex problem. There is considerable evidence that informal social networks play a key role in the diffusion of innovations. The aim is to identify the informal communication networks of project participants and the role these play in the diffusion of construction innovations. The characteristics of this network will be analysed in order to understand how they can be used to accelerate innovation diffusion within and between projects. Social Network Analysis is used to determine informal communication routes. Control and experiment case-study projects are used within two different organisations. This allows informal communication routes concerning innovations to be mapped, whilst testing whether the informal routes can facilitate diffusion. Analysis will focus upon understanding the combination of informal strong and weak ties, and how these impede or facilitate the diffusion of innovation. Initial work suggests the presence of an informal communication network. Actors within this informal network, and the organisation's management, are unaware of its existence and of their informal roles within it. Thus, the network remains an untapped medium for innovation diffusion. It is proposed that successful innovation diffusion depends upon understanding informal strong and weak ties at project, organisation and industry level.
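One standard Social Network Analysis proxy for tie strength, used here purely as an illustration (the paper's own measures are not stated in this abstract), is neighbourhood overlap: actors who share many mutual contacts form strong ties within a team, while low-overlap ties tend to be the weak bridges between teams through which novel information, such as an innovation, travels:

```python
def neighbourhood_overlap(graph, u, v):
    """Fraction of shared contacts between u and v: near 1 for a strong
    tie inside a tight group, near 0 for a weak tie bridging groups."""
    nu, nv = graph[u] - {v}, graph[v] - {u}
    union = nu | nv
    return len(nu & nv) / len(union) if union else 0.0

# Two hypothetical project teams joined by a single informal contact c-d.
graph = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"c", "e", "f"}, "e": {"d", "f"}, "f": {"d", "e"},
}

print(neighbourhood_overlap(graph, "a", "b"))  # strong tie inside a team: 1.0
print(neighbourhood_overlap(graph, "c", "d"))  # weak bridging tie: 0.0
```

On a mapped project network, edges with low overlap like c-d are the candidates for accelerating diffusion between projects, since they connect otherwise separate pools of knowledge.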

Relevance:

100.00%

Publisher:

Abstract:

Frequency recognition is an important task in many engineering fields, such as audio signal processing and telecommunications, for example in applications like Dual-Tone Multi-Frequency (DTMF) detection or recognition of the carrier frequency of a Global Positioning System (GPS) signal. This paper presents the results of investigations of several common Fourier-transform-based frequency recognition algorithms implemented in real time on a Texas Instruments (TI) TMS320C6713 Digital Signal Processor (DSP) core. In addition, suitable metrics are evaluated in order to ascertain which of these selected algorithms is appropriate for audio signal processing.
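One widely used Fourier-based method for this kind of single-frequency detection, and the classic choice for DTMF, is the Goertzel algorithm, which evaluates one DFT bin at a fraction of the cost of a full FFT. A minimal sketch in Python (the paper's implementations are in real time on the C6713; this only illustrates the algorithm):

```python
import math

def goertzel_power(samples, target_freq, sample_rate):
    """Squared magnitude of the DFT bin nearest target_freq, computed
    with the Goertzel recurrence (one second-order IIR filter pass)."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)      # nearest DFT bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

fs, n = 8000, 205            # classic DTMF block size at an 8 kHz rate
tone = [math.sin(2 * math.pi * 770 * i / fs) for i in range(n)]
p_on = goertzel_power(tone, 770, fs)    # bin containing the 770 Hz tone
p_off = goertzel_power(tone, 1336, fs)  # an unrelated DTMF frequency
print(p_on > 10 * p_off)                # the present tone dominates
```

A DTMF detector runs this once per row and column frequency and declares a digit when exactly one row and one column bin stand out, which is far cheaper than an FFT when only eight bins are needed.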

Relevance:

100.00%

Publisher:

Abstract:

The work reported in this paper proposes 'Intelligent Agents', a swarm-array computing approach that applies autonomic computing concepts to parallel computing systems in order to build reliable systems for space applications. Swarm-array computing is a novel computing approach, inspired by swarm robotics, considered a path towards achieving autonomy in parallel computing systems. In the intelligent agent approach, a task to be executed on parallel computing cores is considered a swarm of autonomous agents. A task is carried to a computing core by carrier agents and can be seamlessly transferred between cores in the event of a predicted failure, thereby achieving the self-* objectives of autonomic computing. The approach is validated on a multi-agent simulator.
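The carrier-agent idea can be sketched in a few lines: each agent carries one task and migrates it off a core whose predicted failure risk crosses a threshold (the class names, the risk model and the threshold below are invented for illustration, not taken from the paper's simulator):

```python
class Core:
    """A compute core with a predicted failure risk in [0, 1], e.g.
    derived from temperature or error-rate sensors (hypothetical)."""
    def __init__(self, core_id):
        self.core_id = core_id
        self.failure_risk = 0.0

class CarrierAgent:
    """Carries one task; migrates it to the currently safest core when a
    failure is predicted, a self-healing behaviour in the self-* sense."""
    RISK_THRESHOLD = 0.5

    def __init__(self, task, core):
        self.task = task
        self.core = core

    def step(self, cores):
        if self.core.failure_risk > self.RISK_THRESHOLD:
            self.core = min(cores, key=lambda c: c.failure_risk)
        return self.core.core_id

cores = [Core(i) for i in range(4)]
agent = CarrierAgent("matrix-block-7", cores[0])
cores[0].failure_risk = 0.9        # failure predicted on core 0
print(agent.step(cores))           # the task migrates to a safer core
```

In the paper's framing the interesting part is that the migration decision belongs to the agent, not to a central scheduler, so reliability emerges from many such local decisions across the swarm.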