182 results for Minimal Defining Set
Abstract:
The liberalization of international trade and foreign direct investment through multilateral, regional and bilateral agreements has had profound implications for the structure and nature of food systems, and therefore for the availability, nutritional quality, accessibility, price and promotion of foods in different locations. Public health attention has only relatively recently turned to the links between trade and investment agreements, diets and health, and there is currently no systematic monitoring of this area. This paper reviews the available evidence on the links between trade agreements, food environments and diets from an obesity and non-communicable disease (NCD) perspective. Based on the key issues identified through the review, the paper outlines an approach for monitoring the potential impact of trade agreements on food environments and obesity/NCD risks. The proposed monitoring approach encompasses a set of guiding principles, recommended procedures for data collection and analysis, and quantifiable ‘minimal’, ‘expanded’ and ‘optimal’ measurement indicators to be tailored to national priorities, capacity and resources. Formal risk assessment processes for existing and evolving trade and investment agreements, focused on their impacts on food environments, will help inform the development of healthy trade policy, strengthen domestic nutrition and health policy space and ultimately protect population nutrition.
Abstract:
Textual document sets have become an important and rapidly growing information source on the web, and text classification is one of the crucial technologies for information organisation and management. It has become increasingly important and has attracted wide attention from researchers in different fields. This paper first introduces the main feature selection methods, implementation algorithms and applications of text classification. However, because the knowledge extracted by current data-mining techniques for text classification contains much noise, considerable uncertainty arises in the classification process, from both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve classification performance. Further improving the process of knowledge extraction and the effective utilisation of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed to classify more precisely those textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and a decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, which is very effective for performance assessment of similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and related fields.
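The abstract above does not spell out the authors' algorithm, so the following is only a speculative illustration of the Rough Set machinery it draws on: the lower and upper approximations of a target class under the indiscernibility relation induced by a feature set, which separate clear-cut documents from boundary-region documents. The feature names and toy documents are invented.

```python
from collections import defaultdict

def approximations(objects, features, target):
    """Lower/upper Rough Set approximations of `target` under the
    indiscernibility relation induced by `features`."""
    # Group objects that are indiscernible (identical feature values).
    blocks = defaultdict(set)
    for obj, feats in objects.items():
        blocks[tuple(feats[f] for f in features)].add(obj)
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:   # block lies entirely inside the concept
            lower |= block
        if block & target:    # block overlaps the concept
            upper |= block
    return lower, upper       # boundary region = upper - lower

# Hypothetical documents described by two boolean term features;
# target = the documents labelled "sports".
docs = {
    "d1": {"ball": 1, "vote": 0},
    "d2": {"ball": 1, "vote": 0},
    "d3": {"ball": 1, "vote": 1},
    "d4": {"ball": 1, "vote": 1},
}
lower, upper = approximations(docs, ["ball", "vote"], {"d1", "d2", "d4"})
# d1 and d2 form a block fully inside the target; d3 and d4 are
# indiscernible yet only d4 is in the target, so both fall in the
# boundary region and only the upper approximation contains them.
```

Documents in the boundary region (here d3 and d4) are exactly the ones a Rough Set classifier would flag as "difficult to separate" and route to a finer decision procedure.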
Abstract:
Food literacy has emerged as a term to describe the everyday practicalities associated with healthy eating. The term is increasingly used in policy, practice, research and by the public; however, there is no shared understanding of its meaning. The purpose of this research was to develop a definition of food literacy which was informed by the identification of its components. This was considered from two perspectives: that of food experts which aimed to reflect the intention of existing policy and investment, and that of individuals, who could be considered experts in the everyday practicalities of food provisioning and consumption. Given that food literacy is likely to be highly contextual, this second study focused on disadvantaged young people living in an urban area who were responsible for feeding themselves. The Expert Study used a Delphi methodology (round one n = 43). The Young People’s Study used semi-structured, life-course interviews (n = 37). Constructivist Grounded Theory was used to analyse results. This included constant comparison of data within and between studies. From this, eleven components of food literacy were identified which fell into the domains of: planning and management; selection; preparation; and eating. These were used to develop a definition for the term “food literacy”.
Abstract:
This work presents a demand side response (DSR) model which assists small electricity consumers, exposed to the market price through an aggregator, to proactively mitigate price and peak impacts on the electrical system. The proposed model allows consumers to manage air-conditioning as a function of possible price spikes. The main contribution of this research is to demonstrate how consumers can minimise the total expected cost by optimising air-conditioning to account for occurrences of a price spike in the electricity market. The model investigates how a pre-cooling method can be used to minimise energy costs when there is a substantial risk of an electricity price spike. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics for hot weekdays in the period 2011 to 2012.
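The abstract gives the objective but not the authors' formulation. As a minimal sketch of the expected-cost trade-off behind pre-cooling, the toy model below compares buying energy now at the base price against needing it later when the price may spike. All numbers, the linear thermal offset and the function names are hypothetical, not taken from the paper.

```python
def expected_cost(precool_kwh, later_kwh_without, base_price, spike_price,
                  p_spike, precool_offset):
    """Expected cost of a two-period pre-cooling decision: pre-cooling now
    (at the base price) offsets part of the energy needed later, when the
    price spikes to spike_price with probability p_spike."""
    later_kwh = max(later_kwh_without - precool_offset * precool_kwh, 0.0)
    exp_later_price = p_spike * spike_price + (1 - p_spike) * base_price
    return precool_kwh * base_price + later_kwh * exp_later_price

# Pick the pre-cooling level that minimises expected cost over a small
# candidate grid (illustrative $/kWh figures loosely shaped like a
# market with rare but extreme spikes).
candidates = [0.0, 1.0, 2.0, 3.0]
best = min(candidates,
           key=lambda q: expected_cost(q, later_kwh_without=4.0,
                                       base_price=0.06, spike_price=3.0,
                                       p_spike=0.1, precool_offset=0.8))
# With a 10% spike risk the expected later price (~0.354 $/kWh) dwarfs
# the base price, so maximal pre-cooling wins in this toy instance.
```

The same structure extends naturally to many periods and a thermal comfort constraint, which is where an optimisation model like the one described becomes necessary.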
Abstract:
This paper explores the concept of expertise in intensive care nursing practice from the perspective of its relationship to the current driving forces in healthcare. It discusses the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. It argues that nursing expertise which focuses on the provision of individualised, holistic care, and which is based largely on intuitive decision-making, cannot and should not be reduced to being articulated in positivist terms. The principles of abduction or fuzzy logic, derived from computer science, may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Abstract:
In Thomas Mann’s tetralogy of the 1930s and 1940s, Joseph and His Brothers, the narrator declares history is not only “that which has happened and that which goes on happening in time,” but it is also “the stratified record upon which we set our feet, the ground beneath us.” By opening up history to its spatial, geographical, and geological dimensions Mann both predicts and encapsulates the twentieth-century’s “spatial turn,” a critical shift that divested geography of its largely passive role as history’s “stage” and brought to the fore intersections between the humanities and the earth sciences. In this paper, I draw out the relationships between history, narrative, geography, and geology revealed by this spatial turn and the questions these pose for thinking about the disciplinary relationship between geography and the humanities. As Mann’s statement exemplifies, the spatial turn itself has often been captured most strikingly in fiction, and I would argue nowhere more so than in Graham Swift’s Waterland (1983) and Anne Michaels’s Fugitive Pieces (1996), both of which present space, place, and landscape as having a palpable influence on history and memory. The geographical/geological line that runs through both Waterland and Fugitive Pieces continues through Tim Robinson’s non-fictional, two-volume “topographical” history Stones of Aran. Robinson’s Stones of Aran—which is not history, not geography, and not literature, and yet is all three—constructs an imaginative geography that renders inseparable geography, geology, history, memory, and the act of writing.
Abstract:
Numeric set watermarking is a way to provide ownership proof for numerical data. Numerical data can be considered primitives for multimedia types such as images and videos, since these are organized forms of numeric information. The capability to watermark numerical data therefore directly implies the capability to watermark multimedia objects and to discourage information theft on social networking sites and the Internet in general. Unfortunately, very limited research has been done in the field of numeric set watermarking, owing to underlying limitations in the number of items in the set and the LSBs available for watermarking in each item. In 2009, Gupta et al. proposed a numeric set watermarking model that embeds watermark bits in the items of the set based on a hash value of the items’ most significant bits (MSBs). If an item is chosen for watermarking, a watermark bit is embedded in the least significant bits, and the replaced bit is inserted in the fractional value to provide reversibility. The authors show their scheme to be resilient against the traditional subset addition, deletion, and modification attacks, as well as secondary watermarking attacks. In this paper, we present a bucket attack on this watermarking model. The attack consists of creating buckets of items with the same MSBs and determining whether the items of each bucket carry watermark bits. Experimental results show that the bucket attack is very strong and destroys the entire watermark with a success rate close to 100%. We examine the inherent weaknesses in the watermarking model of Gupta et al. that leave it vulnerable to the bucket attack, and propose potential safeguards that can provide resilience against this attack.
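The grouping step of the attack can be sketched as follows. This is a speculative reconstruction from the abstract alone, not the authors' code: items sharing MSBs receive the same hash-based embedding decision, so a bucket whose members all agree on the low bit is a candidate watermark carrier. The bit widths and the one-bit watermark position are illustrative assumptions.

```python
from collections import defaultdict

def bucket_attack(items, msb_bits=8, item_bits=32):
    """Group integer items by their top `msb_bits` bits and flag buckets
    whose members all share the same least significant bit, mimicking
    the telltale left by an MSB-hash-keyed embedding."""
    buckets = defaultdict(list)
    for x in items:
        buckets[x >> (item_bits - msb_bits)].append(x)
    suspects = []
    for key, members in buckets.items():
        if len(members) < 2:
            continue  # a singleton bucket reveals nothing
        lsbs = {x & 1 for x in members}
        if len(lsbs) == 1:  # uniform LSB across the bucket: suspect
            suspects.append((key, members))
    return suspects

# Toy data: the 0xAB bucket has a uniform LSB (a plausible watermark
# carrier); the 0x11 bucket has mixed LSBs and is left alone.
suspects = bucket_attack([0xAB000000, 0xAB000002, 0xAB000004,
                          0x11000001, 0x11000002])
```

Once suspect buckets are found, randomising their low bits would destroy the embedded bits while perturbing each value only marginally, which matches the near-100% destruction rate reported.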
Abstract:
Motivated by the need for private set operations in a distributed environment, we extend the two-party private matching problem proposed by Freedman, Nissim and Pinkas (FNP) at Eurocrypt ’04 to the distributed setting. By using a secret sharing scheme, we provide a distributed solution of the FNP private matching, called distributed private matching. In our distributed private matching scheme, we use a polynomial to represent one party’s dataset, as in FNP, and then distribute the polynomial to multiple servers. We extend our solution to the distributed set intersection and the cardinality of the intersection, and we further show how to apply distributed private matching to compute the distributed subset relation. Our work extends the primitives of private matching and set intersection by Freedman et al. Our distributed construction may be of great value when the dataset is outsourced and its privacy is the main concern. In such cases, our distributed solutions preserve the utility of those set operations without compromising dataset privacy. Compared with previous works, we achieve a more efficient solution in terms of computation. All protocols constructed in this paper are provably secure against a semi-honest adversary under the Decisional Diffie-Hellman assumption.
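The polynomial representation at the heart of FNP-style matching can be illustrated in the clear. The sketch below deliberately omits the homomorphic encryption and the secret sharing that make the real protocol private; it only shows the algebraic trick: one party's set becomes the roots of a polynomial P, and a reply of the form r·P(y) + y equals y exactly when y is in the set and is random otherwise. Function names and the modulus are illustrative.

```python
import random

def poly_from_set(s):
    """Coefficients (low degree first) of P(x) = prod_{a in s} (x - a)."""
    coeffs = [1]
    for a in s:
        shifted = [0] + coeffs                  # coeffs * x
        scaled = [a * c for c in coeffs] + [0]  # coeffs * a
        coeffs = [u - v for u, v in zip(shifted, scaled)]
    return coeffs

def poly_eval(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

def fnp_match(client_set, server_set, prime=2 ** 61 - 1):
    """Plaintext analogue of FNP matching: for each of its elements y,
    the server returns r*P(y) + y with fresh random r; replies that land
    back in the client's set reveal the intersection, all other replies
    are masked by r."""
    coeffs = poly_from_set(client_set)
    replies = []
    for y in server_set:
        r = random.randrange(1, prime)
        replies.append((r * poly_eval(coeffs, y) + y) % prime)
    return [v for v in replies if v in client_set]

matches = fnp_match({3, 7, 11}, {7, 11, 20})  # intersection is {7, 11}
```

In the distributed variant described above, the coefficients of P would be secret-shared across the servers rather than handled by a single party.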
Abstract:
This paper proposes a method for designing set-point regulation controllers for a class of underactuated mechanical systems in Port-Hamiltonian System (PHS) form. A new set of potential shape variables in closed loop is proposed, which can replace the set of open-loop shape variables (the configuration variables that appear in the kinetic energy). With this choice, the closed-loop potential energy contains free functions of the new variables. By expressing the regulation objective in terms of these new potential shape variables, the desired equilibrium can be assigned, and there is freedom to reshape the potential energy to achieve performance whilst maintaining the PHS form in closed loop. This complements contemporary results in the literature, which preserve the open-loop shape variables. As a case study, we consider a robotic manipulator mounted on a flexible base and compensate for the motion of the base while positioning the end effector with respect to the ground reference. We compare the proposed control strategy with special cases that correspond to other energy shaping strategies previously proposed in the literature.
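For context, the generic port-Hamiltonian form and the energy-shaping idea the abstract builds on can be written as follows; this is a textbook sketch, not the authors' specific construction or choice of shape variables.

```latex
% Open-loop port-Hamiltonian system (generic form)
\dot{x} = \bigl(J(x) - R(x)\bigr)\,\nabla H(x) + g(x)\,u, \qquad
H(q,p) = \tfrac{1}{2}\, p^{\top} M^{-1}(q)\, p + V(q).

% Energy shaping seeks a feedback u = \beta(x) such that the closed loop
% is again port-Hamiltonian with a reshaped energy H_d whose minimum sits
% at the desired equilibrium:
\dot{x} = \bigl(J_d(x) - R_d(x)\bigr)\,\nabla H_d(x), \qquad
H_d(q,p) = \tfrac{1}{2}\, p^{\top} M_d^{-1}(q)\, p + V_d(q).
```

The paper's contribution sits in how the potential shape variables entering $V_d$ are chosen: new closed-loop variables rather than the open-loop ones preserved by earlier schemes.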
Abstract:
Background: Flavonoids such as anthocyanins, flavonols and proanthocyanidins play a central role in fruit colour, flavour and health attributes. In peach and nectarine (Prunus persica) these compounds vary during fruit growth and ripening. Flavonoids are produced by a well-studied pathway which is transcriptionally regulated by members of the MYB and bHLH transcription factor families. We have isolated nectarine flavonoid-regulating genes and examined their expression patterns, which suggest a critical role in the regulation of flavonoid biosynthesis. Results: In nectarine, expression of the genes encoding enzymes of the flavonoid pathway correlated with the concentration of proanthocyanidins, which increases strongly at mid-development. In contrast, the only gene which showed a pattern similar to anthocyanin concentration was UDP-glucose-flavonoid-3-O-glucosyltransferase (UFGT), which was high at the beginning and end of fruit growth, remaining low during the other developmental stages. Expression of flavonol synthase (FLS1) correlated with flavonol levels, both temporally and in a tissue-specific manner. The pattern of UFGT gene expression may be explained by the involvement of different transcription factors, which either up-regulate (MYB10, MYB123, and bHLH3) or repress (MYB111 and MYB16) the transcription of the biosynthetic genes. The expression of a potential proanthocyanidin-regulating transcription factor, MYBPA1, corresponded with proanthocyanidin levels. Functional assays of these transcription factors were used to test their specificity for flavonoid regulation. Conclusions: MYB10 positively regulates the promoters of UFGT and dihydroflavonol 4-reductase (DFR) but not leucoanthocyanidin reductase (LAR). In contrast, MYBPA1 trans-activates the promoters of DFR and LAR, but not UFGT. This suggests exclusive roles for anthocyanin regulation by MYB10 and proanthocyanidin regulation by MYBPA1. Further, these transcription factors appeared to be responsive to both developmental and environmental stimuli.
Abstract:
Guanxi has become a common term in the wider business environment and has attracted increasing attention from researchers. Despite this, a consistent understanding of the concept continues to prove elusive. We review the extant business literature to highlight the major inconsistencies in the way guanxi is currently conceptualized: its breadth, linguistic-cultural depth, temporality, and level of analysis. We conclude with a clearer conceptualization of guanxi which separates its core elements from its antecedents and consequences. Furthermore, we compare and contrast guanxi with western correlates such as social networks and social capital to further consolidate our understanding of the concept.
Accelerometer data reduction: a comparison of four reduction algorithms on select outcome variables
Abstract:
Purpose: Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite their widespread use, there is no standardized way to process and summarize accelerometer data, which limits our ability to compare results across studies. This paper a) reviews decision rules researchers have used in the past, b) compares the impact of using different decision rules on a common data set, and c) identifies issues to consider in accelerometer data reduction. Methods: The methods sections of studies published in 2003 and 2004 were reviewed to determine what decision rules previous researchers used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, and the number of days used to calculate the outcome variables, and to extract bouts of moderate-to-vigorous physical activity (MVPA). For this study, four data reduction algorithms that employ different decision rules were used to analyze the same data set. Results: The review showed that, among studies that reported their decision rules, much variability was observed. Overall, the analyses suggested that using different algorithms affected several important outcome variables. The most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion had an impact on sample size and wearing time, which in turn affected many outcome variables. Conclusions: These findings suggest that the decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
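One family of decision rules the abstract refers to, wear-time detection, can be sketched as follows. The 60-minute zero-count threshold below is a commonly used illustrative choice, not one of the four algorithms compared in the paper.

```python
def wear_minutes(counts, nonwear_run=60):
    """Classify each minute as worn/not worn: a run of `nonwear_run` or
    more consecutive zero-count minutes is treated as non-wear."""
    worn = [True] * len(counts)
    i = 0
    while i < len(counts):
        if counts[i] == 0:
            j = i
            while j < len(counts) and counts[j] == 0:
                j += 1                      # extend the zero run
            if j - i >= nonwear_run:
                for k in range(i, j):
                    worn[k] = False         # long zero run: not worn
            i = j
        else:
            i += 1
    return worn

# 70 zero minutes (non-wear), 5 active minutes, then a short 30-minute
# zero run that stays classified as worn under this rule.
counts = [0] * 70 + [120, 300, 0, 0, 50] + [0] * 30
worn = wear_minutes(counts)
```

Changing `nonwear_run` (some published rules use 20, 30 or 60 minutes) reclassifies whole blocks of data, which is exactly how different decision rules end up shifting wearing time, counts per day and MVPA minutes.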
Abstract:
There’s a diagram that does the rounds online that neatly sums up the difference between the quality of equipment used in the studio to produce music, and the quality of the listening equipment used by the consumer...
Abstract:
Existing multi-model approaches for image set classification extract local models by clustering each image set individually only once, with fixed clusters used for matching with other image sets. However, this may result in the two closest clusters representing different characteristics of an object, owing to different undesirable environmental conditions (such as variations in illumination and pose). To address this problem, we propose to constrain the clustering of each query image set by forcing the clusters to have resemblance to the clusters in the gallery image sets. We first define a Frobenius norm distance between subspaces over Grassmann manifolds based on reconstruction error. We then extract local linear subspaces from a gallery image set via sparse representation. For each local linear subspace, we adaptively construct the corresponding closest subspace from the samples of a probe image set by joint sparse representation. We show that by minimising the sparse representation reconstruction error, we approach the nearest point on a Grassmann manifold. Experiments on the Honda, ETH-80 and Cambridge-Gesture datasets show that the proposed method consistently outperforms several other recent techniques, such as Affine Hull based Image Set Distance (AHISD), Sparse Approximated Nearest Points (SANP) and Manifold Discriminant Analysis (MDA).
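A Frobenius-norm distance between subspaces on a Grassmann manifold can be computed via projection matrices; the sketch below uses the standard projection metric as a stand-in, since the abstract's exact reconstruction-error formulation is not given here.

```python
import numpy as np

def grassmann_proj_distance(A, B):
    """Projection-Frobenius distance between the subspaces spanned by the
    columns of A and B: || P_A - P_B ||_F with P = Q Q^T for an
    orthonormal basis Q (one common Grassmann metric)."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    Pa = Qa @ Qa.T
    Pb = Qb @ Qb.T
    return np.linalg.norm(Pa - Pb, "fro")

# Identical spans give distance 0; orthogonal 1-D subspaces of R^2
# give sqrt(2), the maximum for this metric in that setting.
x = np.array([[1.0], [0.0]])
y = np.array([[0.0], [1.0]])
d0 = grassmann_proj_distance(x, 2 * x)  # same span, scaled basis
d1 = grassmann_proj_distance(x, y)      # orthogonal spans
```

Because the metric depends only on the span, not on the particular basis or image samples chosen, it is a natural fit for comparing local linear subspaces extracted from different image sets.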
Abstract:
Epithelial-mesenchymal transition (EMT) is a feature of migratory cellular processes in all stages of life, including embryonic development and wound healing. Importantly, EMT features cluster with disease states such as chronic fibrosis and cancer. The dissolution of the E-cadherin-mediated adherens junction (AJ) is a key preliminary step in EMT and may occur early or late in the growing epithelial tumour. This is a first step for tumour cells towards stromal invasion, intravasation, extravasation and distant metastasis. The AJ may be inactivated in EMT by directed E-cadherin cleavage; however, it is increasingly evident that the majority of AJ changes are transcriptional and mediated by an expanding group of transcription factors acting directly or indirectly to repress E-cadherin expression. A review of the current literature has revealed that these factors may regulate each other in a hierarchical pattern in which Snail1 (formerly Snail) and Snail2 (formerly Slug) are initially induced, leading to the activation of Zeb family members, TCF3, TCF4, Twist, Goosecoid and FOXC2. Within this general pathway, many inter-regulatory relationships have been defined which may be important in maintaining the EMT phenotype. This may be important given the short half-life of the Snail1 protein. We have investigated these inter-regulatory relationships in the mesenchymal breast carcinoma cell line PMC42 (also known as PMC42ET) and its epithelial derivative, PMC42LA. This review also discusses several newly described regulators of E-cadherin repressors, including oestrogen receptor-α, and new discoveries in hypoxia- and growth factor-induced EMT. Finally, we evaluate how these findings may influence approaches to current cancer treatment.