992 results for "Proper"


Relevance: 10.00%

Abstract:

Discovering proper search intents is a vital process for returning desired results, and it has been a prominent research topic in information retrieval in recent years. Existing methods mainly rely on context-based mining, query expansion, and user-profiling techniques, which still suffer from ambiguity in search queries. In this paper, we introduce a novel ontology-based approach that draws on a world knowledge base to construct personalized ontologies, identifying adequate concept levels for matching user search intents. An iterative mining algorithm is designed to evaluate potential intents level by level until the best result is reached. The proposed approach is evaluated on the large-volume RCV1 data set, and experimental results indicate a distinct improvement in top precision compared with baseline models.
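The level-by-level mining idea can be sketched in code. This is a hedged toy illustration only, not the paper's actual algorithm: the miniature taxonomy, term sets, and scoring function (simple term overlap) are all invented for the example.

```python
# Toy sketch of level-by-level intent mining: walk a small hand-made
# concept taxonomy from general to specific and keep refining while the
# overlap with the query's terms improves. All names are illustrative.

def overlap(query_terms, concept_terms):
    """Fraction of query terms covered by a concept's term set."""
    q = set(query_terms)
    return len(q & set(concept_terms)) / len(q)

def mine_intent(query_terms, ontology_levels):
    """Descend level by level, keeping the best-scoring concept, and
    stop once no deeper concept improves the score."""
    best_concept, best_score = None, -1.0
    for level in ontology_levels:            # general -> specific
        concept, score = max(
            ((c, overlap(query_terms, terms)) for c, terms in level.items()),
            key=lambda pair: pair[1],
        )
        if score <= best_score:              # no improvement: stop here
            break
        best_concept, best_score = concept, score
    return best_concept, best_score

# Each level maps concept -> representative terms (made-up data).
levels = [
    {"technology": ["java", "python", "coffee", "island"]},
    {"programming": ["java", "python", "code"],
     "geography":   ["island", "indonesia", "java"]},
    {"jvm-languages": ["java", "kotlin", "scala"]},
]
concept, score = mine_intent(["java", "code"], levels)
```

For the ambiguous query "java code", the walk settles on the mid-level concept "programming" rather than the over-general or over-specific neighbours, which is the intuition behind stopping at an adequate concept level.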

Relevance: 10.00%

Abstract:

Demands for delivering high instantaneous power in a compressed form (pulse shape) have increased widely during recent decades. The flexible shapes and variable pulse specifications offered by pulsed power have made it a practical and effective supply method for an extensive range of applications. In particular, the release of basic subatomic particles (i.e. electrons, protons and neutrons) from an atom (the ionization process) and the synthesis of molecules to form ions or other molecules are among the reactions that necessitate large amounts of instantaneous power. In addition to such decomposition processes, there have recently been demands for pulsed power in other areas, such as the combination of molecules (e.g. fusion, material joining), radiation generation (e.g. electron beams, lasers and radar), explosions (e.g. concrete recycling), and wastewater, exhaust gas and material surface treatments. These pulses are widely employed in the silent discharge process in all types of materials (gas, fluid and solid), in some cases to form a plasma and consequently accelerate the associated process. Owing to this fast-growing demand for pulsed power in industrial and environmental applications, the need for more efficient and flexible pulse modulators is now receiving greater consideration. Sensitive applications, such as plasma fusion and laser guns, also require precisely produced repetitive pulses of higher quality. Many research studies are being conducted in areas that need a flexible pulse modulator to vary pulse features and investigate the influence of these variations on the application. In addition, there is the need to prevent the waste of a considerable amount of energy caused by the arc phenomena that frequently occur after the plasma process. Control over power flow during the supply process is therefore a critical capability, enabling the pulse supply to halt the supply process at any stage.
Different pulse modulators utilising different accumulation techniques, including Marx generators (MG), magnetic pulse compressors (MPC), pulse forming networks (PFN) and multistage Blumlein lines (MBL), are currently employed to supply a wide range of applications. Gas/magnetic switching technologies (such as the spark gap and hydrogen thyratron) have conventionally been used as switching devices in pulse modulator structures because of their high voltage ratings and considerably short rise times. However, they also suffer from serious drawbacks, such as low efficiency, reliability and repetition rate, and a short life span. Being bulky, heavy and expensive are further disadvantages associated with these devices. Recently developed solid-state switching technology is an appropriate substitute for these devices owing to the benefits it brings to pulse supplies. Besides being compact, efficient, affordable and reliable, with a long life span, its high-frequency switching capability allows repetitive operation of a pulsed power supply. The main concerns in using solid-state transistors are the voltage rating and rise time of available switches, which in some cases cannot satisfy an application's requirements. However, several power electronics configurations and techniques make solid-state devices feasible for high-voltage pulse generation. Therefore, the design and development of novel methods and topologies with higher efficiency and flexibility for pulsed power generators form the main scope of this research work. This aim is pursued through several innovative proposals that can be classified under the following two principal objectives.
• To innovate and develop novel solid-state based topologies for pulsed power generation.
• To improve available technologies that have the potential to accommodate solid-state technology by revising, reconfiguring and adjusting their structures and control algorithms.

The quest to identify novel topologies for proper pulsed power production began with a deep and thorough review of conventional pulse generators and useful power electronics topologies. This study indicated that efficiency and flexibility are the most significant demands of plasma applications that have not been met by state-of-the-art methods. Many solid-state based configurations were considered and simulated in order to evaluate their potential for use in the pulsed power area. Parts of this literature review are documented in Chapter 1 of this thesis. Current-source topologies demonstrate valuable advantages in supplying loads with capacitive characteristics, such as plasma applications. To investigate the influence of the switching transients associated with solid-state devices on the rise time of pulses, simulation-based studies were undertaken. A variable current source was considered to pump different current levels into a capacitive load, and it was evident that dissimilar dv/dt values are produced at the output. Transient effects on pulse rise time were therefore ruled out on the evidence acquired from this examination. A detailed report of this study is given in Chapter 6 of this thesis. This study inspired the design of a solid-state based topology that takes advantage of both current and voltage sources. A series of switch-resistor-capacitor units at the output splits the produced voltage into lower levels, so it can be shared by the switches. A smart but complicated switching strategy was also designed to discharge the residual energy after each supply cycle.
To prevent reverse power flow and to reduce the complexity of the control algorithm in this system, the resistors in the common paths of the units were substituted with diode rectifiers (switch-diode-capacitor). This modification not only makes it feasible to stop the load supply process at any stage (and consequently to save energy), but also enables the converter to operate in a two-stroke mode with asymmetrical capacitors. The determination of components and the energy-exchange calculations were accomplished with respect to application specifications and demands. Both topologies were modelled simply, and simulation studies were carried out with the simplified models. Experimental assessments were also executed on implemented hardware, and the approaches verified the initial analysis. Details of both converters are thoroughly discussed in Chapters 2 and 3 of the thesis. Conventional MGs have recently been modified to use solid-state transistors (e.g. insulated-gate bipolar transistors) instead of magnetic/gas switching devices. The resistive insulators previously used in their structures are substituted by diode rectifiers to give MGs proper voltage sharing. However, despite the use of solid-state technology in MG configurations, further design and control amendments can still be made to achieve improved performance with fewer components. After considering a number of charging techniques, the resonance phenomenon was adopted in a proposal to charge the capacitors. In addition to charging the capacitors to twice the input voltage, triggering the switches at the moment at which the current conducted through them is zero significantly reduces the switching losses. Another configuration is also introduced in this research for the Marx topology, based on commutation circuits that use a current source to charge the capacitors.
According to this design, diode-capacitor units, each including two Marx stages, are connected in cascade through solid-state devices and aggregate the voltages across the capacitors to produce a high-voltage pulse. The polarity of the voltage across one capacitor in each unit is reversed in an intermediate mode by connecting the commutation circuit to that capacitor. Insulation of the input side from the load side is provided in this topology by disconnecting the load from the current source during the supply process. Furthermore, the number of fast switching devices required in both designs is reduced to half the number used in a conventional MG; they are replaced with slower switches (such as thyristors) that need simpler driving modules. In addition, the number of switches contributing to the discharging paths is halved, leading to a reduction in conduction losses. The associated models were simulated, and hardware tests were performed to verify the validity of the proposed topologies. Chapters 4, 5 and 7 of the thesis present all the relevant analysis and approaches for these topologies.
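The ideal relations underlying the Marx and resonant-charging concepts above are simple to state numerically: stage capacitors charged in parallel are discharged in series, and ideal resonant charging brings each capacitor to roughly twice the input voltage. The sketch below illustrates only these textbook relations; the stage count and input voltage are assumed values, not figures from the thesis.

```python
# Illustrative numbers only: a conventional Marx generator charges n
# stage capacitors in parallel to a stage voltage and discharges them in
# series, so the ideal (lossless) output pulse is n times that voltage.

def marx_output(n_stages, v_stage):
    """Ideal series-discharge output of an n-stage Marx bank."""
    return n_stages * v_stage

def resonant_charge_voltage(v_in):
    """Ideal resonant charging brings each capacitor to ~2x the input."""
    return 2.0 * v_in

v_in = 1_000.0                              # 1 kV input, assumed
v_stage = resonant_charge_voltage(v_in)     # ~2 kV per stage
v_pulse = marx_output(10, v_stage)          # 10 stages -> ~20 kV pulse
```

Real designs fall short of these ideals because of switching and conduction losses, which is precisely why the abstract emphasises zero-current switching and halving the devices in the discharge path.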

Relevance: 10.00%

Abstract:

The decision of whether a cell should live or die is fundamental to the wellbeing of all organisms. Despite intense investigation into cell growth and proliferation, only recently has the essential and equally important idea that cells control or programme their own demise for the proper maintenance of cellular homeostasis gained recognition. Furthermore, even though programmed cell death (PCD) has been an extremely active area of research, there are significant gaps in our understanding of the process in plants. In this review, we discuss PCD during plant development and pathogenesis, and compare and contrast this with mammalian apoptosis. © 2008 Blackwell Publishing Ltd.

Relevance: 10.00%

Abstract:

Since users became the focus of product and service design in the last decade, the term User eXperience (UX) has been frequently used in the field of Human-Computer Interaction (HCI). Research on UX facilitates a better understanding of the various aspects of the user's interaction with a product or service. Mobile video, as a new and promising service and research field, has attracted great attention. Because of the significance of UX to the success of mobile video (Jordan, 2002), many researchers have focused on this area, examining users' expectations, motivations, requirements, and usage contexts. As a result, many influencing factors have been explored (Buchinger, Kriglstein, Brandt & Hlavacs, 2011; Buchinger, Kriglstein & Hlavacs, 2009). However, a general framework for the mobile video service is lacking for structuring such a large number of factors. To measure the user experience of multimedia services such as mobile video, quality of experience (QoE) has recently become a prominent concept. In contrast to the traditional concept of quality of service (QoS), QoE not only involves objectively measuring the delivered service but also takes into account the user's needs and desires when using the service, emphasizing the user's overall acceptance of the service. Many QoE metrics can estimate the user-perceived quality or acceptability of mobile video, but they may not be accurate enough for overall UX prediction because of the complexity of UX. Only a few QoE frameworks have addressed further aspects of UX for mobile multimedia applications, and these still need to be transformed into practical measures. The challenge of optimizing UX remains one of adapting to resource constraints (e.g., network conditions, mobile device capabilities, and heterogeneous usage contexts) while meeting complicated user requirements (e.g., usage purposes and personal preferences).
In this chapter, we investigate the existing important UX frameworks, compare their similarities, and discuss some important features that fit the mobile video service. Based on previous research, we propose a simple UX framework for mobile video applications by mapping a variety of influencing factors of UX onto a typical mobile video delivery system. Each component and its factors are explored through comprehensive literature reviews. The proposed framework may benefit the user-centred design of mobile video, by taking full account of UX influences, and the improvement of mobile video service quality, by adjusting the values of certain factors to produce a positive user experience. It may also facilitate related research by locating important issues to study, clarifying research scopes, and setting up proper study procedures. We then review a large body of research on UX measurement, including QoE metrics and QoE frameworks for mobile multimedia. Finally, we discuss how to achieve an optimal quality of user experience by focusing on various aspects of the UX of mobile video. In the conclusion, we suggest some open issues for future study.
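The core move of such a framework, mapping many influencing factors onto one acceptability score, can be sketched as a weighted combination. This is a minimal, hypothetical QoE-style score, not any published metric: the factor names, scores, and weights are all assumed for illustration.

```python
# A toy QoE-style score: combine a few normalized influencing factors
# (network, device, usage context) with weights summing to one. Real QoE
# metrics are far richer; this only illustrates the "map many factors
# onto one acceptability score" idea.

def qoe_score(factors, weights):
    """Weighted mean of factor scores in [0, 1]; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(factors[name] * w for name, w in weights.items())

factors = {"network": 0.9, "device": 0.8, "context": 0.6}   # assumed scores
weights = {"network": 0.5, "device": 0.3, "context": 0.2}   # assumed weights
score = qoe_score(factors, weights)
```

Adjusting the value of one factor (say, degrading "network" to 0.5) moves the overall score, which mirrors the chapter's point about tuning certain factors to produce a positive user experience.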

Relevance: 10.00%

Abstract:

A service-oriented system is composed of independent software units, namely services, that interact with one another exclusively through message exchanges. The proper functioning of such a system depends on whether or not each individual service behaves as the other services expect it to behave. Since services may be developed and operated independently, it is unrealistic to assume that this is always the case. This article addresses the problem of checking and quantifying how much the actual behavior of a service, as recorded in message logs, conforms to the expected behavior as specified in a process model. We consider the case where the expected behavior is defined using the BPEL industry standard (Business Process Execution Language for Web Services). BPEL process definitions are translated into Petri nets, and Petri net-based conformance checking techniques are applied to derive two complementary indicators of conformance: fitness and appropriateness. The approach has been implemented in a toolset for business process analysis and mining, namely ProM, and has been tested in an environment comprising multiple Oracle BPEL servers.
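The intuition behind the fitness indicator can be shown on a deliberately reduced case: a strictly sequential process model replayed against logged traces. This mirrors only the spirit of log-versus-model fitness, not ProM's actual Petri-net token-replay algorithm; the activity names are invented.

```python
# Sketch of the fitness idea behind conformance checking, reduced to a
# strictly sequential model: count how many logged events can be
# replayed in the expected order.

def trace_fitness(expected, observed):
    """Fraction of observed events consumed by an in-order replay
    of the expected activity sequence."""
    i, replayed = 0, 0
    for event in observed:
        if i < len(expected) and event == expected[i]:
            replayed += 1
            i += 1
    return replayed / len(observed) if observed else 1.0

expected = ["receive", "check", "approve", "notify"]
ok  = trace_fitness(expected, ["receive", "check", "approve", "notify"])
bad = trace_fitness(expected, ["receive", "approve", "check", "notify"])
```

A perfectly conforming trace scores 1.0, while a trace that swaps two activities scores lower; a Petri-net replay generalises this to models with choices and parallelism, and appropriateness additionally penalises models that are too permissive.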

Relevance: 10.00%

Abstract:

ICT is becoming a prominent part of healthcare delivery, but it brings with it information privacy concerns for patients and competing concerns for caregivers. A proper balance between these issues must be established in order to fully utilise ICT capabilities in healthcare. Information accountability is a fairly new concept in computer science that focuses on the fair use of information. In this paper we investigate the different issues that need to be addressed when applying information accountability principles to manage healthcare information. We briefly introduce an information accountability framework for handling electronic health records (eHRs). We focus in particular on digital rights management, considering the data in eHRs as digital assets, and on how privacy policies and data usage policies can be represented, as these are key factors in accountability systems.

Relevance: 10.00%

Abstract:

Since the availability of 3D full-body scanners and the associated software systems for operations with large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that measuring a full-body point cloud allows for easier storage of raw data and improves quality control, the difficulty of calculating standardized measurements from the point cloud is widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions showed unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool for replicating specific individuals for further virtual studies, or for personalizing garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections; a 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach, designing for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution functions of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or a distribution function of length and girth, the calculation of combined percentiles, a common ergonomic requirement, is undefined.
Consequently, we suggest critically reviewing the cost and use of 3D anthropometry. We also recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
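The point about combined percentiles can be made concrete with a small numerical example. The data below are entirely made up: being below the 95th percentile on each of two anthropometric measures separately does not identify 95% of the population, because the extremes of the two measures need not occur in the same individuals.

```python
# Made-up paired measurements: stature (cm) and chest girth (cm) for 20
# people. The tallest person has an average girth and the largest girth
# belongs to a person of average stature, so univariate percentiles do
# not compose into a joint "95th percentile person".

def percentile(values, p):
    """Nearest-rank percentile (p in (0, 100]) of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

stature = [150, 155, 158, 160, 162, 165, 168, 170, 172, 175,
           176, 178, 180, 182, 184, 185, 186, 188, 190, 195]
girth   = [ 70,  95,  72, 105,  74,  98,  76,  92,  78,  90,
            80,  88,  82,  86,  84,  85,  83,  87,  81,  80]

s95 = percentile(stature, 95)
g95 = percentile(girth, 95)
both_below = sum(1 for s, g in zip(stature, girth)
                 if s <= s95 and g <= g95) / len(stature)
```

Here only 90% of the sample falls below both univariate 95th percentiles, illustrating why a "combined percentile" is not well defined without the joint distribution.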

Relevance: 10.00%

Abstract:

Air conditioning systems have become an integral part of many modern buildings. The proper design and operation of air conditioning systems have a significant impact not only on the energy use and greenhouse gas emissions of buildings, but also on the thermal comfort and productivity of their occupants. In this paper, the purpose of and need for installing air conditioning systems is first introduced. The methods used for the classification of air conditioning systems are then presented. This is followed by a discussion of the pros and cons of each type of air conditioning system, covering both common and new air conditioning technologies. The procedures used to design air conditioning systems are also outlined, and the implications of air conditioning systems, including their design, selection, operation and maintenance, for building energy efficiency are discussed.

Relevance: 10.00%

Abstract:

Background: Scientific research is an essential component in guiding improvements in health systems. There are no studies examining Sri Lankan medical research output at the international level. The present study evaluated Sri Lankan research performance in medicine as reflected by the research publication output between the years 2000 and 2009. Methods: This study was based on Sri Lankan medical research publication data retrieved from SciVerse Scopus® from January 2000 to December 2009. Articles were selected as follows: affiliation 'Sri Lanka' or 'Ceylon', publication year 'January 2000 to December 2009' and subject area 'Life and Health Sciences'. The articles identified were classified according to disease, medical speciality, institution, major international collaborators, authors and journals. Results: Sri Lanka's cumulative medical publication output between 2000 and 2009 was 1,740 articles, published in 160 different journals. The average annual publication growth rate was 9.1%. The majority of the articles were published in international (n = 950, 54.6%) journals. Most articles were descriptive studies (n = 611, 35.1%), letters (n = 345, 19.8%) and case reports (n = 311, 17.9%). The articles were authored by 148 different Sri Lankan authors from 146 different institutions. The three most prolific local institutions were the Universities of Colombo (n = 547), Kelaniya (n = 246) and Peradeniya (n = 222). Eighty-four countries were found to have published collaborative papers with Sri Lankan authors during the last decade; the UK was the largest collaborating partner (n = 263, 15.1%). Malaria (n = 75), diabetes mellitus (n = 55), dengue (n = 53), accidental injuries (n = 42) and lymphatic filariasis (n = 40) were the major diseases studied. The 1,740 publications were cited 9,708 times, an average of 5.6 citations per paper. The most cited paper had 203 citations, while 597 publications had no citations.
Sri Lankan authors' contribution to the global medical research output during the last decade was only 0.086%. Conclusion: Sri Lankan medical research output during the last decade was only a small fraction of the global research output. There is therefore a need to set up an enabling environment for research, with proper vision, support, funds and training. In addition, collaborations across the region need to be strengthened to face common regional health challenges. Keywords: Sri Lanka, Medical research, Publication, Analysis
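The headline bibliometric figures reported above can be reproduced directly from the counts given in the abstract, which is a useful sanity check on such summary statistics.

```python
# Reproduce two derived figures from the abstract's raw counts:
# average citations per paper and the UK collaboration share.

total_papers    = 1740    # articles, 2000-2009
total_citations = 9708    # citations received
uk_collabs      = 263     # papers co-authored with UK institutions

avg_citations = total_citations / total_papers    # reported as 5.6
uk_share      = 100 * uk_collabs / total_papers   # reported as 15.1%
```

Both derived values round to the figures stated in the abstract (5.6 citations per paper and a 15.1% UK share), confirming the internal consistency of the reported counts.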

Relevance: 10.00%

Abstract:

The taxation of multinational banks is currently governed by the general principles of international tax. However, it is arguable that there are characteristics exclusive to multinational banks that may warrant the consideration of a separate taxing regime. This article argues that, because of the unique nature of multinational banks, the traditional international tax rules governing jurisdiction to tax and allocation of income do not produce an optimal result, as the outcome does not reflect economic reality. That is, the current system does not produce a result that accurately reflects the economic source of the income or the location of the economic activity. The suggested alternative is unitary taxation using global formulary apportionment. Formulary apportionment is considered an alternative that reflects economic reality by recognising the unique nature of multinational banks and allocating income to the location of the economic activity. This unique nature is recognised in the fact that formulary apportionment does not attempt a transactional division of a highly integrated multinational entity; rather, it allocates income to jurisdictions on the basis of an economically justifiable formula. Starting from this recognition, the purpose of this article is to demonstrate that formulary apportionment is a theoretically superior (or optimal) model for the taxation of multinational banks. An optimal regime, for the purposes of this article, is one that distributes taxing rights in an equitable manner between the relevant jurisdictions while simultaneously allowing the decisions of international banks to be tax neutral. In this sense, neutrality is viewed as an economic concept and equity is regarded as a legal concept. A neutral tax system is one in which tax rules do not affect economic choices about commercial activities.
Neutrality should ideally hold across jurisdictions as well as across traditional and non-traditional industries; the primary focus of this article is jurisdictional neutrality. A system that distributes taxing rights in an equitable manner between the relevant jurisdictions ensures that each country receives its fair share of tax revenue. Given the increase in multinational banking, jurisdictions should be concerned that they are receiving their fair share. Inter-nation equity is concerned with re-determining the proper division of the tax base among countries. Richard and Peggy Musgrave argue that the sharing of the tax base by countries of source should be seen as a matter of inter-nation equity requiring international cooperation. The rights of the jurisdiction of residence will also be at issue. To this extent, while it is agreed that inter-nation equity is an essential attribute of an international tax regime, there is no universal agreement as to how to achieve it. The current system attempts to achieve such equity through a combined residency and source regime, with the transfer pricing rules used to apportion income between the relevant jurisdictions. This article suggests that, as an alternative to the current regime, equity could instead be achieved through formulary apportionment. Opposition to formulary apportionment is generally based on the argument that it is not a theoretically superior (or optimal) model because of its implementation difficulties; yet these are two separate issues. As such, this article is divided into two core parts. The first part examines the theoretical soundness of the formulary apportionment model, concluding that it is theoretically superior to the arm's length pricing requirement of the traditional transfer pricing regime. The second part examines the practical implications of accepting formulary apportionment as an optimal model, with a view to disclosing the issues that arise when a formulary apportionment regime is adopted.
Prior to an analysis of the theoretical and practical application of formulary apportionment to multinational banks, the unique nature of these banks is considered. The article concludes that, while there are significant implementation, compliance, and enforcement issues to overcome, the unitary taxation model may be theoretically superior to the current arm’s length model which applies to multinational banks. This conclusion is based on the unitary taxation model providing greater alignment with the unique features of these banks.
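The mechanics of formulary apportionment can be sketched with the classic equally weighted three-factor formula (property, payroll, sales). This is purely illustrative: the article concerns which factors a banking-specific formula should use, and the factor choice, weights, and numbers below are assumptions, not the article's proposal.

```python
# Hedged sketch of formulary apportionment: split a unitary entity's
# global income by the average of each country's shares of three
# factors. Factor choice and weights are illustrative assumptions.

def apportion(global_income, factors_by_country):
    """Allocate income by the mean of each country's factor shares."""
    keys = ("property", "payroll", "sales")
    totals = {k: sum(f[k] for f in factors_by_country.values()) for k in keys}
    allocation = {}
    for country, f in factors_by_country.items():
        share = sum(f[k] / totals[k] for k in keys) / len(keys)
        allocation[country] = global_income * share
    return allocation

factors = {   # made-up factor values for a two-country bank
    "AU": {"property": 60.0, "payroll": 40.0, "sales": 50.0},
    "UK": {"property": 40.0, "payroll": 60.0, "sales": 50.0},
}
allocation = apportion(1_000.0, factors)
```

The allocation exhausts the unitary income exactly, which is one sense in which the formula "distributes the taxing rights" among jurisdictions: every unit of income is taxed once, somewhere, with no transactional division of the integrated entity.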

Relevance: 10.00%

Abstract:

Heat transfer through an attic space into or out of a building is an important issue for attic-shaped houses in both hot and cold climates. One of the important objectives in the design and construction of houses is to provide thermal comfort for occupants. In the present energy-conscious society, houses are also required to be energy efficient, i.e. the energy consumption for heating or air-conditioning must be minimized. Relevant to these objectives, research into heat transfer in attics has been conducted for about three decades. The transient behaviour of an attic space is directly relevant to daily life. Certain periods of the day or night may be considered as having a constant ambient temperature (e.g. 11 am - 2 pm or 11 pm - 2 am). However, at other times (e.g. 5 am - 9 am or 5 pm - 9 pm) the ambient temperature changes with time. Therefore, the steady-state solution alone is not sufficient to describe the fluid flow and heat transfer in the attic space; a discussion of the transient development of the boundary layer is required. A theoretical understanding of the transient behaviour of the flow in the enclosure is obtained through scaling analysis for sudden and ramp heating conditions. A proper identification of the timescales, velocity and thickness relevant to the flow that develops inside the cavity makes it possible to predict theoretically the basic flow features that will survive once the thermal flow in the enclosure reaches a steady state. These scaling predictions have been verified by a series of numerical simulations.
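One ingredient of such a scaling analysis can be illustrated with the classic result for sudden heating: before convection takes over, a thermal boundary layer grows by pure conduction as delta ~ sqrt(kappa * t). The numbers below (air near room temperature) are indicative only and are not taken from the study.

```python
# Conductive growth scale of a thermal boundary layer after sudden
# heating: delta ~ sqrt(kappa * t). Property value is approximate.
import math

def boundary_layer_thickness(kappa, t):
    """Thermal layer growth scale (m) after time t (s), diffusivity kappa."""
    return math.sqrt(kappa * t)

kappa_air = 2.2e-5    # thermal diffusivity of air, m^2/s (approximate)
delta_60s = boundary_layer_thickness(kappa_air, 60.0)   # after one minute
```

After one minute the conductive layer is a few centimetres thick; comparing this growth scale against the cavity depth and the convective velocity scale is how scaling analysis predicts which flow features survive to steady state.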

Relevance: 10.00%

Abstract:

The fatality and injury rates of motorcyclists per registered vehicle are, respectively, 13 and 7 times higher than those of other motor vehicles. The crash involvement rate of motorcyclists as the victim party is 58% at intersections, and as the offending party it is 67% on expressways. Previous research efforts have shown that motorcycle safety programs are not very effective in improving motorcycle safety. This is perhaps due to the inefficient design of safety programs, as specific causal factors may not be well explored. The objective of this study is to propose more sophisticated countermeasures and awareness programs for improving motorcycle safety, after analyzing specific causal factors for motorcycle crashes at intersections and on expressways. Methodologically, this study applies a binary logistic model to explore the at-fault or not-at-fault crash involvement of motorcyclists at those locations. A number of explanatory variables representing roadway characteristics, environmental factors, motorcycle descriptions, and rider demographics have been evaluated. Results show that night-time crash occurrence, the presence of a red light camera, lane position, rider age, licence class, and multi-vehicle collision significantly affect the fault of motorcyclists involved in crashes at intersections. On the other hand, night-time crash occurrence, lane position, speed limit, rider age, licence class, engine capacity, riding with a pillion passenger, foreign-registered motorcycles, and multi-vehicle collision have been found to be significant on expressways. Legislation requiring reflective clothing and the use of reflective markings on motorcycles and helmets is suggested as an effective countermeasure for reducing riders' vulnerability. Red light cameras at intersections reduce the vulnerability of motorcyclists, and hence motorcycle flow and motorcycle crashes should be considered during the installation of red light cameras.
At signalized intersections, motorcyclists may be taught to follow correct movement and queuing rather than weaving through the traffic, as weaving leads them to become victims of other motorists. Riding simulators in training centres can be used to demonstrate proper movement and queuing at junctions. Riding with a pillion passenger and excess speed on expressways are found to significantly influence the at-fault crash involvement of motorcyclists. Hence, motorcyclists should be advised to concentrate more when riding with a pillion passenger and encouraged to avoid excess speed on expressways. Very young and very old riders are found to be more likely at fault than middle-aged riders, so these groups should be targeted for safety improvement. This can be done by arranging safety talks and programs in motorcycling clubs at colleges and universities, as well as in community riding clubs with a high proportion of elderly riders. It is recommended that driving centres use the findings of this study in licensure programs to make motorcyclists more aware of the factors that expose them to crash risk, so that more defensive riding is encouraged.
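The binary logistic model used in the study maps a linear combination of predictors to an at-fault probability through the logistic function, with each coefficient beta corresponding to an odds ratio exp(beta). The coefficients below are invented purely to show the mechanics; they are not the study's estimates.

```python
# Mechanics of a binary logit for at-fault involvement. Coefficients
# and predictor values are illustrative assumptions, not fitted results.
import math

def logistic(z):
    """Logistic (sigmoid) function mapping a real score to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def at_fault_probability(intercept, coefs, x):
    """P(at fault | x) from a linear predictor over named variables."""
    z = intercept + sum(coefs[name] * x[name] for name in coefs)
    return logistic(z)

coefs = {"night_time": 0.8, "red_light_camera": -0.5}   # assumed betas
p_day   = at_fault_probability(-1.0, coefs,
                               {"night_time": 0, "red_light_camera": 1})
p_night = at_fault_probability(-1.0, coefs,
                               {"night_time": 1, "red_light_camera": 0})
odds_ratio_night = math.exp(coefs["night_time"])   # effect as an odds ratio
```

With these assumed coefficients, a night-time crash with no red light camera carries a higher at-fault probability than a daytime crash at a camera-equipped intersection, which is the kind of contrast the study's significance tests formalise.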

Relevance: 10.00%

Abstract:

Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, great challenges arise in maintaining high data quality under rapid data throughput and heterogeneous access. Yet, despite the importance of data quality, the literature has usually reduced data quality to detecting and correcting poor data, such as outliers and incomplete or inaccurate values. As a result, organisations are unable to assess data quality efficiently and effectively. An accurate and proper data quality assessment method would enable users to benchmark their systems and monitor their improvement. This paper introduces a granule mining approach for measuring the random degree of error in data, which enables decision makers to conduct accurate quality assessments and locate the most severely affected data, thereby providing an accurate estimate of the human and financial resources needed for quality improvement tasks.
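A much-simplified version of the granule idea is to partition records into groups and rank the groups by error rate, so that quality effort is directed at the worst data first. This sketch is our own illustration of that assessment-and-localisation goal, not the paper's granule mining method; the granule attribute and error flags are invented.

```python
# Toy granule-style assessment: partition records by an attribute and
# rank the resulting granules by their error rate. Data are made up.
from collections import defaultdict

def granule_error_rates(records, granule_key):
    """Map each granule to its share of erroneous records."""
    counts = defaultdict(lambda: [0, 0])      # granule -> [errors, total]
    for rec in records:
        bucket = counts[rec[granule_key]]
        bucket[0] += rec["has_error"]
        bucket[1] += 1
    return {g: errs / total for g, (errs, total) in counts.items()}

records = [
    {"dept": "sales",   "has_error": 1}, {"dept": "sales",   "has_error": 1},
    {"dept": "sales",   "has_error": 0}, {"dept": "billing", "has_error": 0},
    {"dept": "billing", "has_error": 0}, {"dept": "billing", "has_error": 1},
]
rates = granule_error_rates(records, "dept")
worst = max(rates, key=rates.get)             # granule needing attention first
```

Ranking granules this way turns a single global error count into a localisation tool, which is the practical payoff the abstract claims for accurate quality assessment.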

Relevance: 10.00%

Abstract:

The potential of multiple distribution static synchronous compensators (DSTATCOMs) to improve the voltage profile of radial distribution networks has been reported in the literature by a few authors. However, the operation of multiple DSTATCOMs across a distribution feeder may introduce control interactions and/or voltage instability. This study proposes a control scheme that alleviates interactions among controllers and ensures proper reactive power sharing among the DSTATCOMs. A generalised mathematical model is presented to analyse the interactions among any number of DSTATCOMs in the network, and the criterion for controller design is developed by conducting eigenvalue analysis on this model. The proposed control scheme is tested in the time domain on a sample radial distribution feeder installed with multiple DSTATCOMs, and the test results are presented.
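One common reading of "proper reactive power sharing" is sharing in proportion to each device's rating, so no single DSTATCOM saturates while others idle. The sketch below shows only that sharing objective; it says nothing about the paper's eigenvalue-based controller design, and the ratings and demand figures are assumed.

```python
# Proportional reactive power sharing among DSTATCOMs on one feeder:
# split the feeder's reactive demand in proportion to device ratings.
# Ratings (kvar) and the 600 kvar demand are illustrative assumptions.

def share_reactive_power(q_demand, ratings):
    """Dispatch q_demand across devices in proportion to their ratings."""
    total = sum(ratings.values())
    return {name: q_demand * r / total for name, r in ratings.items()}

ratings  = {"D1": 300.0, "D2": 200.0, "D3": 500.0}
dispatch = share_reactive_power(600.0, ratings)
```

Each device then carries the same fraction of its rating (here 60%), which is the steady-state target a sharing controller would be designed to enforce.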

Relevance: 10.00%

Abstract:

The transmission of bacteria is more likely to occur from wet skin than from dry skin; therefore, the proper drying of hands after washing should be an integral part of the hand hygiene process in health care. This article systematically reviews the research on the hygienic efficacy of different hand-drying methods. A literature search was conducted in April 2011 using the electronic databases PubMed, Scopus, and Web of Science. Search terms used were hand dryer and hand drying. The search was limited to articles published in English from January 1970 through March 2011. Twelve studies were included in the review. Hand-drying effectiveness includes the speed of drying, degree of dryness, effective removal of bacteria, and prevention of cross-contamination. This review found little agreement regarding the relative effectiveness of electric air dryers. However, most studies suggest that paper towels can dry hands efficiently, remove bacteria effectively, and cause less contamination of the washroom environment. From a hygiene viewpoint, paper towels are superior to electric air dryers. Paper towels should be recommended in locations where hygiene is paramount, such as hospitals and clinics.