951 results for "Approach that generates sense"


Relevance: 100.00%

Abstract:

The best-effort nature of the Internet poses a significant obstacle to the deployment of many applications that require guaranteed bandwidth. In this paper, we present a novel approach that enables two edge/border routers, which we call Internet Traffic Managers (ITMs), to use an adaptive number of TCP connections to set up a tunnel of desirable bandwidth between them. The number of TCP connections that comprise this tunnel is elastic in the sense that it increases/decreases in tandem with competing cross traffic to maintain a target bandwidth. An origin ITM would then schedule incoming packets from an application requiring guaranteed bandwidth over that elastic tunnel. Unlike many proposed solutions that aim to deliver soft QoS guarantees, our elastic-tunnel approach does not require any support from core routers (as with IntServ and DiffServ); it is scalable in the sense that core routers do not have to maintain per-flow state (as with IntServ); and it is readily deployable within a single ISP or across multiple ISPs. To evaluate our approach, we develop a flow-level control-theoretic model to study the transient behavior of established elastic TCP-based tunnels. The model captures the effect of cross-traffic connections on our bandwidth allocation policies. Through extensive simulations, we confirm the effectiveness of our approach in providing soft bandwidth guarantees. We also outline our kernel-level ITM prototype implementation.
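The core elastic-tunnel idea, growing or shrinking a pool of TCP connections so the tunnel holds a target bandwidth against cross traffic, can be sketched as follows. This is a minimal illustration rather than the paper's control-theoretic design; the `measured_bw` probe, the connection limits and the proportional sizing rule are all assumptions made for the sketch.

```python
# Sketch only: size the tunnel's TCP connection pool so that the
# aggregate bandwidth tracks a target. Each connection's fair share
# shrinks as cross traffic grows, so the controller adapts the count.

def adjust_connections(n_conns, measured_bw, target_bw, min_conns=1, max_conns=64):
    """Return an updated connection count for the elastic tunnel."""
    if measured_bw <= 0:
        # No throughput observed yet; probe upwards cautiously.
        return min(n_conns + 1, max_conns)
    per_conn = measured_bw / n_conns       # bandwidth one connection obtains now
    needed = round(target_bw / per_conn)   # connections needed at that share
    return max(min_conns, min(needed, max_conns))
```

For example, if 4 connections currently achieve 8 Mb/s against a 16 Mb/s target, the rule doubles the pool to 8 connections; if cross traffic later recedes, the pool shrinks again.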

Relevance: 100.00%

Abstract:

Phosphorylation of GTP-binding-regulatory (G)-protein-coupled receptors by specific G-protein-coupled receptor kinases (GRKs) is a major mechanism responsible for agonist-mediated desensitization of signal transduction processes. However, to date, studies of the specificity of these enzymes have been hampered by the difficulty of preparing the purified and reconstituted receptor preparations required as substrates. Here we describe an approach that obviates this problem by utilizing highly purified membrane preparations from Sf9 and 293 cells overexpressing G-protein-coupled receptors. We use this technique to demonstrate specificity of several GRKs with respect to both receptor substrates and the enhancing effects of G-protein beta gamma subunits on phosphorylation. Enriched membrane preparations of the beta 2- and alpha 2-C2-adrenergic receptors (ARs, where alpha 2-C2-AR refers to the AR whose gene is located on human chromosome 2) prepared by sucrose density gradient centrifugation from Sf9 or 293 cells contain the receptor at 100-300 pmol/mg of protein and serve as efficient substrates for agonist-dependent phosphorylation by beta-AR kinase 1 (GRK2), beta-AR kinase 2 (GRK3), or GRK5. Stoichiometries of agonist-mediated phosphorylation of the receptors by GRK2 (beta-AR kinase 1), in the absence and presence of G beta gamma, are 1 and 3 mol/mol, respectively. The rate of phosphorylation of the membrane receptors is 3 times faster than that of purified and reconstituted receptors. While phosphorylation of the beta 2-AR by GRK2, -3, and -5 is similar, the activity of GRK2 and -3 is enhanced by G beta gamma whereas that of GRK5 is not. In contrast, whereas GRK2 and -3 efficiently phosphorylate alpha 2-C2-AR, GRK5 is quite weak. The availability of a simple direct phosphorylation assay applicable to any cloned G-protein-coupled receptor should greatly facilitate elucidation of the mechanisms of regulation of these receptors by the expanding family of GRKs.

Relevance: 100.00%

Abstract:

Food insecurity, chronic hunger, starvation and malnutrition continue to affect millions of individuals throughout the developing world, especially Sub-Saharan Africa. Various initiatives by African governments and International Agencies such as the UN, the industrial nations, the International Monetary Fund, the World Bank and the World Trade Organisation to boost economic development, have failed to provide the much-needed solution to these challenges. The impact of these economic shifts and the failures of structural adjustment programmes on the nutritional well-being and health of the most vulnerable members of poor communities cannot be over-emphasised. The use of ad hoc measures as an adjunct to community-based rural integrated projects have provided little success and will be unsustainable unless they are linked to harnessing available local resources. The present paper therefore focuses on exploring alternative ways of harnessing the scant agricultural resources by employing a scientific approach to food-related problem-solving. The food multimix (FMM) concept offers a scientific contribution alongside other attempts currently in use by the World Food Programme, WHO and FAO to meet the food insecurity challenges that confront most of the developing world in the twenty-first century. It is an innovative approach that makes better use of traditional food sources as a tool for meeting community nutritional needs. The FMM concept employs a food-based approach using traditional methods of food preparation and locally-available, cheap and affordable staples (fruits, pulses, vegetables and legumes) in the formulation of nutrient-enriched multimixes. Developed recipes can provide >= 40% of the daily nutritional requirements of vulnerable groups, including patients with HIV/AIDS and children undergoing nutrition rehabilitation. 
The FMM approach can also be used as a medium- to long-term adjunct to community-based rural integration projects aimed at health improvement and economic empowerment in Sub-Saharan Africa.

Relevance: 100.00%

Abstract:

Many food webs are so complex that it is difficult to distinguish the relationships between predators and their prey. We have therefore developed an approach that produces a food web which clearly demonstrates the strengths of the relationships between the predator guilds of demersal fish and their prey guilds in a coastal ecosystem. Subjecting volumetric dietary data for 35 abundant predators along the lower western Australian coast to cluster analysis and the SIMPROF routine separated the various species × length-class combinations into 14 discrete predator guilds. Following nMDS ordination, the sequence of points for these predator guilds represented a 'trophic' hierarchy. This demonstrated that, with increasing body size, several species progressed upwards through this hierarchy, reflecting a marked change in diet, whereas others remained within the same guild. A novel use of cluster analysis and SIMPROF then identified each group of prey that was ingested in a common pattern across the full suite of predator guilds. This produced 12 discrete groups of taxa (prey guilds), each typically comprising similar ecological/functional prey, which were then also aligned in a hierarchy. The hierarchical arrangements of the predator and prey guilds were plotted against each other to show the percentage contribution of each prey guild to the diet of each predator guild. The resultant shade plot demonstrates quantitatively how food resources are spread among the fish species and reveals that two prey guilds, one containing cephalopods and teleosts and the other small benthic/epibenthic crustaceans and polychaetes, were consumed by all predator guilds.

Relevance: 100.00%

Abstract:

This paper provides an exposition of Michel Foucault's 'history of the present' in order to make the case for its relevance to the study of social work history. It sets out the general principles underpinning this practice and considers its application to a particular research question relating to the history of child welfare and protection social work in the Republic of Ireland. The paper seeks to highlight the challenges involved in its use and illuminate its potential value as an approach for researching the history of social work. It is concluded that this exposition offers one appropriate approach that could be employed within the growing field of social work history research across Europe.

Relevance: 100.00%

Abstract:

The paper is primarily concerned with the modelling of aircraft manufacturing cost. The aim is to establish an integrated life cycle balanced design process through a systems engineering approach to interdisciplinary analysis and control. The cost modelling is achieved using the genetic causal approach that enforces product family categorisation and the subsequent generation of causal relationships between deterministic cost components and their design source. This utilises causal parametric cost drivers and the definition of the physical architecture from the Work Breakdown Structure (WBS) to identify product families. The paper presents applications to the overall aircraft design with a particular focus on the fuselage as a subsystem of the aircraft, including fuselage panels and localised detail, as well as engine nacelles. The higher level application to aircraft requirements and functional analysis is investigated and verified relative to life cycle design issues for the relationship between acquisition cost and Direct Operational Cost (DOC), for a range of both metal and composite subsystems. Maintenance is considered in some detail as an important contributor to DOC and life cycle cost. The lower level application to aircraft physical architecture is investigated and verified for the WBS of an engine nacelle, including a sequential build stage investigation of the materials, fabrication and assembly costs. The studies are then extended by investigating the acquisition cost of aircraft fuselages, including the recurring unit cost and the non-recurring design cost of the airframe sub-system. The systems costing methodology is facilitated by the genetic causal cost modeling technique as the latter is highly generic, interdisciplinary, flexible, multilevel and recursive in nature, and can be applied at the various analysis levels required of systems engineering. 
Therefore, the main contribution of the paper is a methodology for applying systems engineering costing, supported by the genetic causal cost modelling approach, whether at a requirements, functional or physical level.

Relevance: 100.00%

Abstract:

Models of currency competition focus on the 5% of trading attributable to balance-of-payments flows. We introduce an information approach that focuses on the other 95%. Important departures from traditional models arise when transactions convey information. First, prices reveal different information depending on whether trades are direct or through vehicle currencies. Second, missing markets arise due to insufficiently symmetric information, rather than insufficient transactions scale. Third, the indeterminacy of equilibrium that arises in traditional models is resolved: currency trade patterns no longer concentrate arbitrarily on market size. Empirically, we provide a first analysis of transactions across a full market triangle: the euro, yen and US dollar. The estimated transaction effects on prices support the information approach.

Relevance: 100.00%

Abstract:

From the late 1860s, opera was an abiding preoccupation for Dvořák, as it was for many composers of the Czech national revival. This article examines Dvořák’s relationship with his librettists, his approach to their texts, and the extent to which he was prepared to mould their content. While there is no surviving correspondence between Dvořák and the librettist of his last opera, Jaroslav Vrchlický, a copy of the libretto of Armida with annotations in both Vrchlický’s and Dvořák’s hands was found in 2007 among the writer’s papers. Although Dvořák’s stage sense has often been called into question, it is clear that his interventions in the libretto of Armida, in the first and last acts in particular, show a practical, theatrical approach that did much to enhance the dramatic impact of Armida’s first entry and the final chorus of the opera.

Relevance: 100.00%

Abstract:

For high-technology entrepreneurs, attaining an appropriate level of investment to support new ventures is challenging as substantial investment is usually required prior to revenue generation. Consequently, entrepreneurs must present their firms as investment ready in the context of an uncertain market response and an absence of any trading history. Gaining tenancy within a business incubator can be advantageous to this process given that placement enhances entrepreneurial contact with potential investors whilst professional client advisors (CAs) use their expertise to assist in the development of a credible business plan. However, for the investment proposal to be successful, it must make sense to fund managers despite their lack of technological expertise and product knowledge. Thus, this article explores how incubator CAs and entrepreneurs act in concert to mould innovative ideas into plausible business plans that make sense to venture fund investors. To illustrate this process, we draw upon empirical evidence which suggests that CAs act as sense makers between venture fund managers (VFMs) and high-technology entrepreneurs, yet their role and influence appears undervalued. These findings have implications for entrepreneurial access to much needed funding and also for the identification of investment opportunities for VFMs. © 2011 Taylor & Francis.

Relevance: 100.00%

Abstract:

Today there is a growing interest in the integration of health monitoring applications in portable devices, necessitating the development of methods that improve the energy efficiency of such systems. In this paper, we present a systematic approach that enables energy-quality trade-offs in spectral analysis systems for bio-signals, which are useful in monitoring various health conditions such as those associated with the heart rate. To enable such trade-offs, the processed signals are initially expressed in a basis in which the significant components that carry most of the relevant information can easily be distinguished from the parts that influence the output to a lesser extent. Such a classification allows the pruning of operations associated with the less significant signal components, leading to power savings with minor quality loss, since only less useful parts are pruned under the given requirements. To exploit the attributes of the modified spectral analysis system, thresholding rules are determined and adopted at design- and run-time, allowing the static or dynamic pruning of less-useful operations based on the accuracy and energy requirements. The proposed algorithm is implemented on a typical sensor node simulator and results show up to 82% energy savings when static pruning is combined with voltage and frequency scaling, compared to the conventional algorithm in which such trade-offs were not available. In addition, experiments with numerous cardiac samples from various patients show that such energy savings come with a 4.9% average accuracy loss, which does not affect the system's detection capability for sinus arrhythmia, which was used as a test case.
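The pruning step described above can be illustrated with a toy sketch. The paper's basis transformation, thresholding rules and run-time adaptation are not reproduced here; the magnitude threshold and the skipped-operation count are assumed stand-ins for the energy-quality trade-off.

```python
# Sketch only: zero out spectral components whose magnitude falls below
# a threshold, so downstream operations on them can be skipped. Returns
# the pruned spectrum and the fraction of components (operations) skipped.

def prune_spectrum(coeffs, threshold):
    """Keep only components deemed significant under the given threshold."""
    pruned = [c if abs(c) >= threshold else 0.0 for c in coeffs]
    skipped = sum(1 for c in coeffs if abs(c) < threshold) / len(coeffs)
    return pruned, skipped
```

Raising the threshold skips more work (more energy saved) at the cost of accuracy, which mirrors the static/dynamic trade-off the paper describes.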

Relevance: 100.00%

Abstract:

This paper (co-written with Dr Maria Lohan, Dr Carmel Kelly & Professor Laura Lundy) will describe the ethical review process required to undertake health research in the UK, and explain an approach that can help researchers deal with ethical and methodological dilemmas in their research. Ethical review is necessary to ensure researchers and participants are protected, yet the requirement to ‘pass’ numerous committees may be challenging, particularly for health researchers who work with vulnerable groups and sensitive topics. The inclusion of these groups and topics is crucial if health researchers are to understand health disparities and implement appropriate interventions with health benefits for vulnerable populations. It is proposed that, to overcome ethical and methodological challenges and pitfalls, researchers must implement strategies that advocate for, and increase the participation of, vulnerable populations in health research. A ‘children’s rights-based approach’ using participatory methodology will be described that draws on the jurisprudence of international law (United Nations Convention on the Rights of the Child, 1989) and provides a framework that may empower ethics committees to carry out their function confidently. The role of the researcher, framed within the context of doctoral-level study, will be reviewed in terms of the investment required and the benefits of utilising this approach. It will be argued that adopting this approach with vulnerable groups not only guarantees their meaningful participation in the research process and permits their voices to be heard, but also offers ethics committees an internationally agreed legal framework, ratified by their governing States, from which to fulfil their obligations and resolve their ethical dilemmas.
Increasing the representation and participation of vulnerable groups in health research can inform the development of health policy and practice based on ‘insider knowledge’ that better engages with and more adequately reflects their specific needs. This is likely to yield numerous health, social and economic benefits for all of society through the delivery of more equitable, effective and sustainable services.

Relevance: 100.00%

Abstract:

CCTV (Closed-Circuit TeleVision) systems are broadly deployed in the present world. To ensure in-time reaction for intelligent surveillance, determining the gender of people of interest is a fundamental task for real-world applications. However, normal video algorithms for gender profiling (usually face profiling) have three drawbacks. First, the profiling result is always uncertain. Second, the profiling result is not stable: the degree of certainty usually varies over time, sometimes even to the extent that a male is classified as a female, and vice versa. Third, for a robust profiling result in cases where a person’s face is not visible, other features, such as body shape, are required. These algorithms may provide different recognition results; at the very least, they will provide different degrees of certainty. To overcome these problems, in this paper we introduce a Dempster-Shafer (DS) evidential approach that makes use of profiling results from multiple algorithms over a period of time; in particular, Denoeux’s cautious rule is applied for fusing mass functions along time lines. Experiments show that this approach does provide better results than single profiling results and classic fusion results. Furthermore, it is found that if severe mis-classification has occurred at the beginning of the time line, the combination can yield undesirable results. To remedy this weakness, we further propose three extensions to the evidential approach above, incorporating notions of time-window, time-attenuation and time-discounting, respectively. These extensions also apply Denoeux’s rule along time lines and include the DS approach as a special case. Experiments show that these three extensions provide better results than their predecessor when mis-classifications occur.
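The general evidential-fusion idea can be illustrated on the two-class frame {male, female}. Note that this sketch uses Dempster's classic combination rule, not the cautious rule of Denoeux employed in the paper, and the dictionary encoding of mass functions is an assumption made for illustration only.

```python
# Sketch of Dempster's rule on the frame {male, female}. A mass function
# is a dict with keys 'M', 'F' and 'MF' (the whole frame, i.e. "uncertain").

def dempster_combine(m1, m2):
    """Fuse two mass functions from independent profiling results."""
    k = m1['M'] * m2['F'] + m1['F'] * m2['M']   # conflicting mass
    if k >= 1.0:
        raise ValueError("totally conflicting evidence")
    norm = 1.0 - k
    return {
        'M':  (m1['M'] * m2['M'] + m1['M'] * m2['MF'] + m1['MF'] * m2['M']) / norm,
        'F':  (m1['F'] * m2['F'] + m1['F'] * m2['MF'] + m1['MF'] * m2['F']) / norm,
        'MF': (m1['MF'] * m2['MF']) / norm,
    }
```

Fusing two moderately confident "male" readings yields a more confident "male" belief, which is the behaviour the paper exploits when combining results over time; the cautious rule differs in how it handles non-independent sources.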

Relevance: 100.00%

Abstract:

This paper investigated the problem of confined flow under dams and water-retaining structures using stochastic modelling. The approach advocated in the study combined a finite element method, based on the equation governing the dynamics of incompressible fluid flow through a porous medium, with a random field generator that generates random hydraulic conductivity based on a lognormal probability distribution. The resulting model was then used to analyse confined flow under a hydraulic structure. Cases for a structure provided with a cutoff wall and for when the wall did not exist were both tested. Various statistical parameters that reflected different degrees of heterogeneity were examined, and the changes in the mean seepage flow, the mean uplift force and the mean exit gradient observed under the structure were analysed. Results reveal that under heterogeneous conditions, the reduction made by the sheetpile in the uplift force and exit hydraulic gradient may be underestimated when deterministic solutions are used.
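The stochastic ingredient, sampling lognormally distributed hydraulic conductivities and averaging a flow quantity over many realisations, can be sketched as below. The finite element solver is replaced by a hypothetical `solve` callback, and all parameter values are assumptions for the sketch.

```python
# Sketch only: Monte Carlo over random conductivity fields. The paper
# couples such fields to a finite element seepage model; here `solve`
# is a stand-in for that solver.
import math
import random

def lognormal_field(n_cells, mean_log_k=-4.0, sigma_log_k=0.5, seed=42):
    """Draw one realisation of lognormal hydraulic conductivities."""
    rng = random.Random(seed)
    return [math.exp(rng.gauss(mean_log_k, sigma_log_k)) for _ in range(n_cells)]

def mean_seepage(n_realisations, n_cells, solve):
    """Average a seepage quantity over random conductivity fields."""
    total = 0.0
    for i in range(n_realisations):
        ks = lognormal_field(n_cells, seed=i)
        total += solve(ks)
    return total / n_realisations
```

As a toy `solve`, the harmonic mean of conductivities in series under a unit gradient already shows why heterogeneity matters: the harmonic mean is dominated by low-conductivity cells, so the stochastic average differs from the deterministic single-value answer.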

Relevance: 100.00%

Abstract:

Online forums are becoming a popular way of finding useful information on the web. Search over forums for existing discussion threads has so far been limited to keyword-based search, owing to the minimal effort required on the part of users. However, it is often not possible to capture all the relevant context of a complex query using a small number of keywords. Example-based search, which retrieves similar discussion threads given one exemplary thread, is an alternate approach that can help the user provide richer context and vastly improve forum search results. In this paper, we address the problem of finding threads similar to a given thread. Towards this, we propose a novel methodology to estimate similarity between discussion threads. Our method exploits the thread structure to decompose threads into sets of weighted overlapping components. It then estimates pairwise thread similarities by quantifying how well the information in the threads is mutually contained within each other, using lexical similarities between their underlying components. We compare our proposed methods on real datasets against state-of-the-art thread retrieval mechanisms and illustrate that our techniques outperform others by large margins on popular retrieval evaluation measures such as NDCG, MAP, Precision@k and MRR. In particular, consistent improvements of up to 10% are observed on all evaluation measures.
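The component-based similarity idea can be illustrated with a much-simplified sketch in which individual posts play the role of components and Jaccard word overlap stands in for the paper's weighted lexical similarity; the actual decomposition and weighting scheme are not reproduced here.

```python
# Sketch only: symmetric containment-style similarity between threads,
# where a thread is a list of posts and each post is matched against
# its best counterpart in the other thread.

def jaccard(a, b):
    """Word-set overlap between two posts."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def thread_similarity(thread_a, thread_b):
    """Average best-match containment in both directions."""
    def contained(src, dst):
        return sum(max(jaccard(p, q) for q in dst) for p in src) / len(src)
    return 0.5 * (contained(thread_a, thread_b) + contained(thread_b, thread_a))
```

Because each post is scored against its best match rather than the whole concatenated thread, a long thread that merely contains a short one still scores highly in one direction, which captures the mutual-containment notion the abstract describes.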

Relevance: 100.00%

Abstract:

The performance of real-time networks is under continuous improvement as a result of several trends in the digital world. However, these trends not only bring improvements but also exacerbate a series of undesirable aspects of real-time networks, such as communication latency, jitter of that latency, and packet drop rate. This Thesis focuses on the communication errors that appear in such real-time networks, from the point of view of automatic control. Specifically, it investigates the effects of packet drops in automatic control over fieldbuses, as well as architectures and optimal techniques for their compensation. Firstly, a new approach to address the problems that arise from such packet drops is proposed. This novel approach is based on the simultaneous transmission of several values in a single message. Such messages can be from sensor to controller, in which case they comprise several past sensor readings, or from controller to actuator, in which case they comprise estimates of several future control values. A series of tests reveals the advantages of this approach. This approach is then expanded to accommodate the techniques of contemporary optimal control. However, unlike the first approach, which deliberately refrains from sending certain messages in order to make more efficient use of network resources, in the second case the techniques are used to reduce the effects of packet losses. After these two approaches based on data aggregation, optimal control in packet-dropping fieldbuses is also studied, using generalized actuator output functions. This study ends with the development of a new optimal controller, as well as the identification of the function, among the generalized functions that dictate the actuator's behaviour in the absence of a new control message, that leads to optimal performance.
The Thesis also presents a different line of research, related to the output oscillations that take place as a consequence of using classic co-design techniques for networked control. The proposed algorithm has the goal of allowing the execution of such classical co-design algorithms without causing an output oscillation that increases the value of the cost function. Such increases may, under certain circumstances, negate the advantages of applying the classical co-design techniques. Yet another line of research investigated algorithms, more efficient than contemporary ones, to generate task execution sequences that guarantee that at least a given number of activated jobs will be executed out of every set composed of a predetermined number of contiguous activations. This algorithm may, in the future, be applied to the generation of message transmission patterns in the above-mentioned techniques for the efficient use of network resources. The proposed task generation algorithm improves on its predecessors in that it can schedule systems that its predecessor algorithms cannot. The Thesis also presents a mechanism that allows multi-path routing to be performed in wireless sensor networks while ensuring that no value is counted in duplicate. Thereby, this technique improves the performance of wireless sensor networks, rendering them more suitable for control applications. As mentioned before, this Thesis centres on techniques for improving the performance of distributed control systems in which several elements are connected through a fieldbus that may be subject to packet drops. The first three approaches are directly related to this topic, with the first two approaching the problem from an architectural standpoint, whereas the third does so on more theoretical grounds.
The fourth approach ensures that solutions found in the literature that pursue goals similar to the objectives of this Thesis can do so without causing other problems that might invalidate those solutions. Finally, the Thesis presents an approach centred on the efficient generation of the transmission patterns used in the aforementioned techniques.
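The first aggregation idea, controller-to-actuator messages carrying estimates of several future control values, can be sketched as follows. The class and its interface are hypothetical illustrations rather than the Thesis's implementation; the point is only that an actuator holding a plan of future values can ride out dropped messages.

```python
# Sketch only: an actuator that receives a plan of future control values
# in each message. When a message is dropped, it keeps consuming the
# plan (or holds its last output) instead of stalling.

class Actuator:
    def __init__(self):
        self.plan = []      # future control values from the last message
        self.output = 0.0

    def on_message(self, future_values):
        """A controller message arrived with estimated future values."""
        self.plan = list(future_values)

    def tick(self):
        """Advance one control period, consuming the plan if available."""
        if self.plan:
            self.output = self.plan.pop(0)
        # else: no fresh value; keep applying the previous output
        return self.output
```

With a plan of three values and two subsequent dropped periods, the actuator applies the planned values first and then holds the last one, which is the compensation behaviour the aggregation approach targets.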