989 results for ANSWER


Relevance: 10.00%

Abstract:

Criminal behavior has often been explained by the idea that offenders lack self-control. Yet Wilson and Daly reported that juvenile offenders exhibit time-discounting tendencies similar to those of nonoffending juveniles. As no previous study has compared the time-discounting behavior of adult offenders with that of nonoffenders, we raise the question: do adult offenders exhibit shorter time horizons, i.e. a stronger tendency to discount future rewards? To answer this question, 89 offenders (ex-prisoners and prisoners) and 106 nonoffenders completed a time-discounting measure containing 27 different monetary choices. Our results show that, counter to findings with juvenile offenders, adult offenders (ex-prisoners) exhibit significantly shorter time horizons and discount more than nonoffenders as delayed payoffs increase to medium and large rewards. Furthermore, both offenders and nonoffenders are less likely to discount as future gains increase to medium and large rewards.
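The abstract does not specify the discounting model used, but choice measures of this kind are typically scored with a hyperbolic discount function. A minimal sketch under that assumption (all amounts and delays hypothetical): a larger discount rate k means a shorter time horizon.

```python
# Hyperbolic discounting sketch (hypothetical values; the abstract does not
# name the model). Under the hyperbolic model, the present value of a reward
# A received after delay D is V = A / (1 + k*D); larger k = steeper
# discounting, i.e. a shorter time horizon.

def present_value(amount, delay_days, k):
    """Subjective value of `amount` received after `delay_days`."""
    return amount / (1.0 + k * delay_days)

def prefers_immediate(immediate, delayed, delay_days, k):
    """True if the immediate option is worth more under discount rate k."""
    return immediate > present_value(delayed, delay_days, k)

# A steep discounter (k=0.05) takes $40 now over $80 in 60 days;
# a shallow discounter (k=0.005) waits for the larger reward.
print(prefers_immediate(40, 80, 60, k=0.05))   # True
print(prefers_immediate(40, 80, 60, k=0.005))  # False
```

Fitting k to a participant's 27 choices would then locate the indifference point that separates their "immediate" from "delayed" responses.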

Relevance: 10.00%

Abstract:

The number of well-dated pollen diagrams in Europe has increased considerably over the last 30 years and many of them have been submitted to the European Pollen Database (EPD). This allows for the construction of increasingly precise maps of Holocene vegetation change across the continent. Chronological information in the EPD has been expressed in uncalibrated radiocarbon years, and most chronologies to date are based on this time scale. Here we present new chronologies for most of the datasets stored in the EPD based on calibrated radiocarbon years. Age information associated with pollen diagrams is often derived from the pollen stratigraphy itself or from other sedimentological information. We reviewed these chronological tie points and assigned uncertainties to them. The steps taken to generate the new chronologies are described and the rationale for a new classification system for age uncertainties is introduced. The resulting chronologies are fit for most continental-scale questions. They may not provide the best age model for particular sites, but may be viewed as general-purpose chronologies. Taxonomic particularities of the data stored in the EPD are explained. An example is given of how the database can be queried to select samples with appropriate age control as well as the suitable taxonomic level to answer a specific research question. © 2013 The Author(s).
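The core of a general-purpose chronology of the kind described is an age-depth model built on dated tie points. A minimal sketch, assuming simple linear interpolation between calibrated tie points (real EPD chronologies use more sophisticated models and carry uncertainties; all numbers here are hypothetical):

```python
# Age-depth model sketch: calibrated tie points (depth, age) are
# interpolated linearly to date pollen samples at arbitrary depths.
# All values are illustrative, not from the EPD.

def interpolate_age(depth, tie_points):
    """Linear interpolation between calibrated tie points.
    tie_points: sorted list of (depth_cm, cal_yr_BP) pairs."""
    for (d0, a0), (d1, a1) in zip(tie_points, tie_points[1:]):
        if d0 <= depth <= d1:
            return a0 + (a1 - a0) * (depth - d0) / (d1 - d0)
    raise ValueError("depth outside dated interval")

# Core top (collection year ~ -60 cal yr BP) plus two dated levels:
tie_points = [(0, -60), (150, 2300), (400, 8900)]
print(interpolate_age(275, tie_points))  # 5600.0
```

Assigning an uncertainty class to each tie point, as the paper's classification system does, would then let a query keep only samples whose interpolated ages meet a chosen quality threshold.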

Relevance: 10.00%

Abstract:

The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQ, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
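The three questions the VULA method answers for a KPI (how much underperformance, how often, how long) can be sketched as simple statistics over a simulated KPI series. This is an illustrative reduction, not the thesis's actual metrics; the series and the performance norm are hypothetical.

```python
# Sketch of VULA-style vulnerability statistics for one KPI: the worst
# shortfall below a performance norm, the number of underperformance
# episodes, and the longest episode. All values are illustrative.

def vula_stats(kpi_series, norm):
    """Return (worst shortfall, number of episodes, longest episode length)
    for the periods in which the KPI drops below its norm."""
    shortfalls = [norm - v for v in kpi_series if v < norm]
    episodes, run, longest = 0, 0, 0
    for v in kpi_series:
        if v < norm:
            run += 1
            if run == 1:
                episodes += 1
            longest = max(longest, run)
        else:
            run = 0
    worst = max(shortfalls) if shortfalls else 0.0
    return worst, episodes, longest

# Daily service level (%) from a discrete-event simulation, norm = 95:
kpi = [97, 96, 92, 90, 96, 97, 94, 96]
print(vula_stats(kpi, 95))  # (5, 2, 2): worst dip 5 pts, 2 episodes, max 2 days
```

A decision maker could compare such statistics before and after a candidate redesign to judge whether the redesign improves performance robustness.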
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance impact reductive redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies, and repetition of the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as the basis of the VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.

Relevance: 10.00%

Abstract:

The paper reports data from an on-line peer tutoring project in which 78 students aged 9–12 from Scotland and Catalonia peer tutored each other in English and Spanish via a managed on-line environment. Significant gains in first language (Catalonian pupils), modern language (Scottish pupils) and attitudes towards modern languages (both Catalonian and Scottish pupils) were reported for the experimental group as compared to the control group. Results indicated that pupils tutored each other using Piagetian techniques of error correction during the project. Error correction provided by tutors to tutees focussed on morphosyntax, more specifically the correction of verbs. Peer support provided via the on-line environment was predominantly based on the tutor giving the right answer to the tutee. High rates of impact on tutee-corrected messages were observed. The implications for peer tutoring initiatives taking place via on-line environments are discussed, and implications for policy and practice are explored.

Relevance: 10.00%

Abstract:

In conditional probabilistic logic programming, the two most common forms of answer to a query are a probability interval or a precise probability obtained using the maximum entropy principle. The former can be noninformative (e.g., the interval [0, 1]) and the reliability of the latter is questionable when the prior knowledge is imprecise. To address this problem, in this paper we propose methods to quantitatively measure whether a probability interval or a single probability is sufficient for answering a query. We first propose an approach to measuring the ignorance of a probabilistic logic program with respect to a query. The measure of ignorance (w.r.t. a query) reflects how reliable a precise probability for the query can be, and a high value of ignorance suggests that a single probability is not suitable for the query. We then propose a method to measure the probability that the exact probability of a query falls in a given interval, i.e., a second-order probability. We call it the degree of satisfaction. If the degree of satisfaction is high enough w.r.t. the query, then the given interval can be accepted as the answer to the query. We also prove that our measures satisfy many desirable properties, and we use a case study to demonstrate their significance. © Springer Science+Business Media B.V. 2012
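The degree-of-satisfaction idea (a second-order probability that the exact answer lies in a candidate interval) can be illustrated with a Monte Carlo sketch. Note the Beta distribution here is my stand-in for the imprecise prior knowledge, not the paper's logic-programming semantics; all parameters are hypothetical.

```python
# Monte Carlo sketch of a degree-of-satisfaction-style measure: model the
# uncertain true probability of a query as a Beta distribution (an assumed
# stand-in for imprecise prior knowledge) and estimate the second-order
# probability that it falls inside a candidate answer interval.

import random

def degree_of_satisfaction(alpha, beta, lo, hi, n=100_000, seed=0):
    rng = random.Random(seed)
    hits = sum(lo <= rng.betavariate(alpha, beta) <= hi for _ in range(n))
    return hits / n

# Under diffuse knowledge (Beta(2, 2)), the wide interval [0.3, 0.7]
# captures the true probability far more often (~0.57) than the narrow
# interval [0.45, 0.55] (~0.15), so only the former might be accepted
# as a sufficiently satisfying answer.
print(degree_of_satisfaction(2, 2, 0.30, 0.70))
print(degree_of_satisfaction(2, 2, 0.45, 0.55))
```

A threshold on this quantity then plays the role the paper assigns to the degree of satisfaction: accept the interval as the answer only if the estimated second-order probability is high enough.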

Relevance: 10.00%

Abstract:

Architects typically interpret Heidegger to mean that dwelling in the Black Forest was more authentic than living in an industrialised society; however, we cannot turn back the clock, so we are confronted with the reality of modernisation. Since the Second World War, production has shifted from material to immaterial assets. Increasingly, place is believed to offer resistance to this fluidity, but this belief can conversely be viewed as expressing a sublimated anxiety about our role in the world: the need to create buildings that are self-consciously contextual suggests that we may no longer be rooted in material places, but in immaterial relations.
This issue has been pondered by David Harvey in his paper From Place to Space and Back Again, where he argues that the role of place in legitimising identity is ultimately a political process, as the interpretation of its meaning is dependent on whose interpretation it is. Doreen Massey has found that different classes of people are more or less mobile, and that mobility is related to class and education rather than to nationality or geography. These thinkers point to a different set of questions than the usual space/place divide: how can we begin to address the economic mediation of spatial production to develop an ethical production of place? Part of the answer is provided by the French architectural practice Lacaton & Vassal in their book Plus. They ask themselves how to produce more space for the same cost so that people can enjoy a better quality of life. Another French practitioner, Patrick Bouchain, has argued that architects' fees should be inversely proportional to the amount of material resources that they consume. These approaches use economics as a starting point for generating architectural form and point to more ethical possibilities for architectural practice.

Relevance: 10.00%

Abstract:

Multiuser diversity (MUDiv) is one of the central concepts in multiuser (MU) systems. In particular, MUDiv allows for scheduling among users in order to eliminate the negative effects of unfavorable channel fading conditions of some users on the system performance. Scheduling, however, consumes energy (e.g., for making users' channel state information available to the scheduler). This extra usage of energy, which could otherwise be used for data transmission, can be very wasteful, especially if the number of users is large. In this paper, we answer the question of how much MUDiv is required for energy-limited MU systems. Focusing on uplink MU wireless systems, we develop MU scheduling algorithms which aim at maximizing the MUDiv gain. Toward this end, we introduce a new realistic energy model which accounts for scheduling energy and describes the distribution of the total energy between scheduling and data transmission stages. Using the fact that such energy distribution can be controlled by varying the number of active users, we optimize this number by either i) minimizing the overall system bit error rate (BER) for a fixed total energy of all users in the system or ii) minimizing the total energy of all users for fixed BER requirements. We find that for a fixed number of available users, the achievable MUDiv gain can be improved by activating only a subset of users. Using asymptotic analysis and numerical simulations, we show that our approach benefits from MUDiv gains higher than that achievable by a generic greedy access algorithm, which is the optimal scheduling method for energy-unlimited systems. © 2010 IEEE.
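The trade-off the paper optimizes can be sketched numerically: each scheduled user costs channel-state feedback energy out of a fixed budget, so activating only the k best of N users leaves more energy for transmission. This is a deliberately simplified stand-in for the paper's energy model; all constants and the SNR proxy are illustrative.

```python
# Sketch of the scheduling-energy vs. data-energy trade-off: activating a
# subset of k users costs k units of scheduling energy but still captures
# most of the multiuser diversity gain. Rayleigh fading is modelled via
# exponential channel power gains; all constants are illustrative.

import random

def effective_snr(n_active, n_available, e_total, e_sched_per_user,
                  trials=20_000, seed=1):
    """Average SNR proxy: best channel gain among a random active subset,
    scaled by the energy left over for data transmission."""
    rng = random.Random(seed)
    e_data = e_total - e_sched_per_user * n_active
    if e_data <= 0:
        return 0.0  # scheduling ate the whole budget
    acc = 0.0
    for _ in range(trials):
        gains = [rng.expovariate(1.0) for _ in range(n_available)]
        acc += max(rng.sample(gains, n_active)) * e_data
    return acc / trials

# Activating a subset beats scheduling everyone under this toy model:
scores = {k: effective_snr(k, 16, e_total=10.0, e_sched_per_user=0.5)
          for k in (2, 8, 16)}
print(max(scores, key=scores.get))  # 8
```

The shape matches the paper's qualitative finding: diversity gain grows only logarithmically with the number of scheduled users, while the scheduling energy cost grows linearly, so an intermediate subset size wins.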

Relevance: 10.00%

Abstract:

The answer to the question of what it means to say that a right is absolute is often taken for granted, yet still sparks doubt and scepticism. This article investigates absoluteness further, bringing rights theory and the judicial approach on an absolute right together. A theoretical framework is set up that addresses two distinct but potentially related parameters of investigation: the first is what I have labelled the ‘applicability’ criterion, which looks at whether and when the applicability of the standard referred to as absolute can be displaced, in other words whether other considerations can justify its infringement; the second parameter, which I have labelled the ‘specification’ criterion, explores the degree to which and bases on which the content of the standard characterised as absolute is specified. This theoretical framework is then used to assess key principles and issues that arise in the Strasbourg Court’s approach to Article 3. It is suggested that this analysis allows us to explore both the distinction and the interplay between the two parameters in the judicial interpretation of the right and that appreciating the significance of this is fundamental to the understanding of and discourse on the concept of an absolute right.

Relevance: 10.00%

Abstract:

The crop management practice of alternate wetting and drying (AWD) is being promoted by IRRI and the national research and extension program in Bangladesh and other parts of the world as a water-saving irrigation practice that reduces the environmental impact of dry season rice production through decreased water usage, and potentially increases yield. Evidence is growing that AWD will dramatically reduce the concentration of arsenic in harvested rice grains, conferring a third major advantage over permanently flooded dry season rice production. AWD may also increase the concentration of essential dietary micronutrients in the grain. However, three crucial aspects of AWD irrigation require further investigation. First, why is yield generally altered in AWD? Second, is AWD sustainable economically (viability of farmers' livelihoods) and environmentally (aquifer water table heights) over long-term use? Third, are current cultivars optimized for this irrigation system? This paper describes a multidisciplinary research project, conceived to answer these questions by combining advanced soil biogeochemistry with crop physiology, genomics, and systems biology. The description attempts to show how the breakthroughs in next generation sequencing could be exploited to better utilize local collections of germplasm and identify the molecular mechanisms underlying biological adaptation to the environment within the context of soil chemistry and plant physiology.

Relevance: 10.00%

Abstract:

Why do firms pay dividends? To answer this question, we use a hand-collected data set of companies traded on the London stock market between 1825 and 1870. As tax rates were effectively zero, the capital market was unregulated, and there were no institutional stockholders, we can rule out these potential determinants ex ante. We find that, even though they were legal, share repurchases were not used by firms to return cash to shareholders. Instead, our evidence provides support for the information–communication explanation for dividends, while providing little support for agency, illiquidity, catering, or behavioral explanations. © The Authors 2013. Published by Oxford University Press [on behalf of the European Finance Association]. All rights reserved.

Relevance: 10.00%

Abstract:

Protein-protein interactions play a central role in many cellular processes. Their characterisation is necessary in order to analyse these processes and for the functional identification of unknown proteins. Existing detection methods such as the yeast two-hybrid (Y2H) and tandem affinity purification (TAP) method provide a means to rapidly answer questions regarding protein-protein interactions, but have limitations which restrict their use to certain interaction networks; furthermore, they provide little information regarding interaction localisation at the subcellular level. The development of protein-fragment complementation assays (PCA) employing a fluorescent reporter such as a member of the green fluorescent protein (GFP) family has led to a new method of interaction detection termed Bimolecular Fluorescent Complementation (BiFC). These assays have become important tools for understanding protein interactions and the development of whole genome interaction maps. BiFC assays have the advantages of very low background signal coupled with rapid detection of protein-protein interactions in vivo while also providing information regarding interaction compartmentalisation. Modified forms of the assay, such as the use of combinations of spectral variants of GFP, have allowed simultaneous visualisation of multiple competing interactions in vivo. Advantages and disadvantages of the method are discussed in the context of other fluorescence-based interaction monitoring techniques.

Relevance: 10.00%

Abstract:

Electing a leader is a fundamental task in distributed computing. In its implicit version, only the leader must know who is the elected leader. This paper focuses on studying the message and time complexity of randomized implicit leader election in synchronous distributed networks. Surprisingly, the most "obvious" complexity bounds have not been proven for randomized algorithms. The "obvious" lower bounds of Ω(m) messages (m is the number of edges in the network) and Ω(D) time (D is the network diameter) are non-trivial to show for randomized (Monte Carlo) algorithms. (Recent results showing that even Ω(n) (n is the number of nodes in the network) is not a lower bound on the messages in complete networks make the above bounds somewhat less obvious.) To the best of our knowledge, these basic lower bounds have not been established even for deterministic algorithms (except for the limited case of comparison algorithms, where it was also required that some nodes may not wake up spontaneously, and that D and n were not known).

We establish these fundamental lower bounds in this paper for the general case, even for randomized Monte Carlo algorithms. Our lower bounds are universal in the sense that they hold for all universal algorithms (such algorithms should work for all graphs), apply to every D, m, and n, and hold even if D, m, and n are known, all the nodes wake up simultaneously, and the algorithms can make any use of the nodes' identities. To show that these bounds are tight, we present an algorithm that uses O(m) messages; an O(D)-time algorithm is known. A slight adaptation of our lower bound technique gives rise to an Ω(m) message lower bound for randomized broadcast algorithms.

An interesting fundamental problem is whether both upper bounds (messages and time) can be reached simultaneously in the randomized setting for all graphs. (The answer is known to be negative in the deterministic setting). We answer this problem partially by presenting a randomized algorithm that matches both complexities in some cases. This already separates (for some cases) randomized algorithms from deterministic ones. As first steps towards the general case, we present several universal leader election algorithms with bounds that trade-off messages versus time. We view our results as a step towards understanding the complexity of universal leader election in distributed networks.
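A toy simulation conveys the flavour of randomized implicit election (this is not the paper's algorithm; it assumes a synchronous clique where nodes know n): each node volunteers with a small probability and draws a random rank, and the best announced (rank, id) pair wins, so only comparisons of announcements decide the leader.

```python
# Toy randomized implicit leader election in a synchronous clique
# (illustrative only, not the paper's algorithm; assumes nodes know n).
# Each round, every node flips a p-coin to become a candidate and draws a
# random rank; the highest-ranked candidate wins. A round with c candidates
# costs about c*(n-1) announcement messages; a round fails (no candidate)
# with probability (1 - p)**n, so rounds repeat until someone volunteers.

import random

def elect(n, p, rng):
    """Run rounds until at least one node volunteers; return (leader, rounds)."""
    rounds = 0
    while True:
        rounds += 1
        candidates = [(rng.random(), v) for v in range(n) if rng.random() < p]
        if candidates:
            return max(candidates)[1], rounds  # best (rank, id) pair wins

n, p = 1000, 8 / 1000                  # expect ~8 candidates per round
leader, rounds = elect(n, p, random.Random(42))
print(0 <= leader < n)                 # True
```

With p = c/n the expected number of candidates is c, so the expected message cost is about c·(n−1) per round, far below the Ω(m) barrier of a clique; this is the kind of gap that makes the complete-network message bounds mentioned above "somewhat less obvious".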

Relevance: 10.00%

Abstract:

There is a pressing need for more-efficient trial designs for biomarker-stratified clinical trials. We suggest a new approach to trial design that links novel treatment evaluation with the concurrent evaluation of a biomarker within a confirmatory phase II/III trial setting. We describe a new protocol using this approach in advanced colorectal cancer called FOCUS4. The protocol will ultimately answer three research questions for a number of treatments and biomarkers: (1) After a period of first-line chemotherapy, do targeted novel therapies provide signals of activity in different biomarker-defined populations? (2) If so, do these definitively improve outcomes? (3) Is evidence of activity restricted to the biomarker-defined groups? The protocol randomizes novel agents against placebo concurrently across a number of different biomarker-defined population-enriched cohorts: BRAF mutation; activated AKT pathway: PI3K mutation/absolute PTEN loss tumors; KRAS and NRAS mutations; and wild type at all the mentioned genes. Within each biomarker-defined population, the trial uses a multistaged approach with flexibility to adapt in response to planned interim analyses for lack of activity. FOCUS4 is the first test of a protocol that assigns all patients with metastatic colorectal cancer to one of a number of parallel population-enriched, biomarker-stratified randomized trials. Using this approach allows questions regarding efficacy and safety of multiple novel therapies to be answered in a relatively quick and efficient manner, while also allowing for the assessment of biomarkers to help target treatment.

Relevance: 10.00%

Abstract:

Fundamental ecological research is both intrinsically interesting and provides the basic knowledge required to answer applied questions of importance to the management of the natural world. The 100th anniversary of the British Ecological Society in 2013 is an opportune moment to reflect on the current status of ecology as a science and to highlight priorities for future work. To do this, we identified 100 questions of fundamental importance in pure ecology. We elicited questions from ecologists working across a wide range of systems and disciplines. The 754 questions submitted (listed in the online appendix) by 388 participants were narrowed down to the final 100 through a process of discussion, rewording and repeated rounds of voting, carried out during a two-day workshop and thereafter. The questions reflect many of the important current conceptual and technical preoccupations of ecology. For example, many questions concerned the dynamics of environmental change and complex ecosystem interactions, as well as the interaction between ecology and evolution. The questions reveal a dynamic science with novel subfields emerging. For example, one group of questions was dedicated to disease and micro-organisms and another to human impacts and global change, reflecting the emergence of new subdisciplines that would not have been foreseen a few decades ago. The list also contains a number of questions that have perplexed ecologists for decades and are still seen as crucial to answer, such as the link between population dynamics and life-history evolution. Synthesis: the 100 questions identified reflect the state of ecology today. Using them as an agenda for further research would lead to a substantial enhancement in understanding of the discipline, with practical relevance for the conservation of biodiversity and ecosystem function. © 2013 The Authors. Journal of Ecology © 2013 British Ecological Society.