99 results for Minimal Set


Relevance:

20.00%

Publisher:

Abstract:

Visitors to prisons are generally innocent of any crime, but their interaction with inmates has been studied as a possible means of reducing recidivism. The way visitors' centres are currently designed mainly takes into consideration security principles and the needs of guards or prison management. The human experience of the relatives or friends aiming to provide emotional support to inmates is usually not considered; facilities have been designed with an approach that often discourages people from visiting. This paper discusses possible principles for designing prison visitors' centres that take into consideration practical needs as well as human factors. A comparative case-study analysis of different secure typologies, such as libraries, airports, and children's hospitals, suggests how to approach the design of prisons so that visitors are not punished for the crimes of those they are visiting.


The question of whether poser race affects the happy categorization advantage (the faster categorization of happy than of negative emotional expressions) has been answered inconsistently. Hugenberg (2005) found the happy categorization advantage only for own-race faces, whereas faster categorization of angry expressions was evident for other-race faces. Kubota and Ito (2007) found a happy categorization advantage for both own-race and other-race faces. These results have vastly different implications for understanding the influence of race cues on the processing of emotional expressions. The current study replicates the results of both prior studies and indicates that face type (computer-generated vs. photographic), presentation duration, and especially stimulus set size influence the happy categorization advantage as well as the moderating effect of poser race.


As of June 2009, 361 genome-wide association studies (GWAS) had been referenced by the HuGE database. GWAS require DNA from many thousands of individuals, relying on suitable DNA collections. We recently performed a multiple sclerosis (MS) GWAS where a substantial component of the cases (24%) had DNA derived from saliva. Genotyping was done on the Illumina genotyping platform using the Infinium Hap370CNV DUO microarray. Additionally, we genotyped 10 individuals in duplicate using both saliva- and blood-derived DNA. The performance of blood- versus saliva-derived DNA was compared using genotyping call rate, which reflects both the quantity and quality of genotyping per sample and the “GCScore,” an Illumina genotyping quality score, which is a measure of DNA quality. We also compared genotype calls and GCScores for the 10 sample pairs. Call rates were assessed for each sample individually. For the GWAS samples, we compared data according to source of DNA and center of origin. We observed high concordance in genotyping quality and quantity between the paired samples and minimal loss of quality and quantity of DNA in the saliva samples in the large GWAS sample, with the blood samples showing greater variation between centers of origin. This large data set highlights the usefulness of saliva DNA for genotyping, especially in high-density single-nucleotide polymorphism microarray studies such as GWAS.
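The two quality metrics the study compares are simple to compute. A minimal sketch follows; the genotype encoding (string calls, `None` for a missing call) and function names are illustrative assumptions, not from the study:

```python
def call_rate(genotypes):
    """Fraction of SNPs with a non-missing genotype call
    (missing calls are coded as None in this sketch)."""
    called = sum(g is not None for g in genotypes)
    return called / len(genotypes)

def concordance(sample_a, sample_b):
    """Fraction of SNPs where two samples (e.g. blood- vs saliva-derived
    DNA from the same individual) yield the same genotype call,
    counting only positions called in both samples."""
    both = [(a, b) for a, b in zip(sample_a, sample_b)
            if a is not None and b is not None]
    return sum(a == b for a, b in both) / len(both)

blood = ["AA", "AG", None, "GG"]
saliva = ["AA", "AG", "AG", "GG"]
```

A per-sample call rate of this kind is what the study uses to compare DNA sources; concordance on duplicate pairs checks that the same individual genotypes identically from both tissues.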


Several fringing coral reefs in Moreton Bay, Southeast Queensland, some 300 km south of the Great Barrier Reef (GBR), are set in a relatively high-latitude, estuarine environment that is considered marginal for coral growth. Previous work indicated that these marginal reefs, as with many fringing reefs of the inner GBR, ceased accreting in the mid-Holocene. This research presents for the first time data from the subsurface profile of the mid-Holocene fossil reef at Wellington Point, comprising U/Th dates of in situ and framework corals, and trace element analysis from the age-constrained carbonate fragments. Based on trace element proxies, the palaeo-water quality during reef accretion was reconstructed. Results demonstrate that the reef initiated more than 7,000 yr BP during the post-glacial transgression, and the initiation progressed to the west as sea level rose. In situ micro-atolls indicate that sea level was at least 1 m above present mean sea level by 6,680 years ago. The reef remained in "catch-up" mode, with a seaward-sloping upper surface, until it stopped aggrading abruptly at ca 6,000 yr BP; no lateral progradation occurred. Changes in sediment composition encountered in the cores suggest that after the laterite substrate was covered by the reef, most of the sediment was produced by the carbonate factory with minimal terrigenous influence. Rare earth element, Y, and Ba proxies indicate that water quality during reef accretion was similar to oceanic waters, considered suitable for coral growth. A slight decline in water quality, inferred from increased Ba in the later stages of growth, may be related to increased riverine input and the partial closing of the bay due to tidal delta progradation, climatic change, and/or a slight sea-level fall.
The age data suggest that termination of reef growth coincided with a slight lowering of sea level, activation of ENSO and a consequent increase in seasonality, lowering of temperatures, and constriction of oceanic flushing. At the cessation of reef accretion, the environmental conditions in western Moreton Bay were changing from open marine to estuarine. The living coral community appears similar to the fossil community, but without the branching Acropora spp. that were more common in the fossil reef. In this marginal setting, coral growth periods do not always correspond to periods of reef accretion because coral abundance is insufficient, and several environmental constraints keep modern coral growth below the level required for reef growth. Based on these findings, Moreton Bay may be unsuitable as a long-term coral refuge for most species currently living in the GBR.


The liberalization of international trade and foreign direct investment through multilateral, regional and bilateral agreements has had profound implications for the structure and nature of food systems, and therefore, for the availability, nutritional quality, accessibility, price and promotion of foods in different locations. Public health attention has only relatively recently turned to the links between trade and investment agreements, diets and health, and there is currently no systematic monitoring of this area. This paper reviews the available evidence on the links between trade agreements, food environments and diets from an obesity and non-communicable disease (NCD) perspective. Based on the key issues identified through the review, the paper outlines an approach for monitoring the potential impact of trade agreements on food environments and obesity/NCD risks. The proposed monitoring approach encompasses a set of guiding principles, recommended procedures for data collection and analysis, and quantifiable ‘minimal’, ‘expanded’ and ‘optimal’ measurement indicators to be tailored to national priorities, capacity and resources. Formal risk assessment processes of existing and evolving trade and investment agreements, which focus on their impacts on food environments will help inform the development of healthy trade policy, strengthen domestic nutrition and health policy space and ultimately protect population nutrition.


Textual document sets have become an important and rapidly growing information source on the web, and text classification is one of the crucial technologies for information organisation and management. It has become increasingly important and has attracted wide attention from researchers in different fields. This paper first introduces feature selection methods, implementation algorithms, and applications of text classification. However, the knowledge extracted by current data-mining techniques for text classification contains much noise, which introduces uncertainty into both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve classification performance. Further improving knowledge extraction and the effective utilisation of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed to classify more precisely those textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and a decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, which is effective for performance assessment in similar research; and to propose a promising research direction for addressing challenging problems in text classification, text mining, and related fields.


In Thomas Mann’s tetralogy of the 1930s and 1940s, Joseph and His Brothers, the narrator declares history is not only “that which has happened and that which goes on happening in time,” but it is also “the stratified record upon which we set our feet, the ground beneath us.” By opening up history to its spatial, geographical, and geological dimensions Mann both predicts and encapsulates the twentieth-century’s “spatial turn,” a critical shift that divested geography of its largely passive role as history’s “stage” and brought to the fore intersections between the humanities and the earth sciences. In this paper, I draw out the relationships between history, narrative, geography, and geology revealed by this spatial turn and the questions these pose for thinking about the disciplinary relationship between geography and the humanities. As Mann’s statement exemplifies, the spatial turn itself has often been captured most strikingly in fiction, and I would argue nowhere more so than in Graham Swift’s Waterland (1983) and Anne Michaels’s Fugitive Pieces (1996), both of which present space, place, and landscape as having a palpable influence on history and memory. The geographical/geological line that runs through both Waterland and Fugitive Pieces continues through Tim Robinson’s non-fictional, two-volume “topographical” history Stones of Aran. Robinson’s Stones of Aran—which is not history, not geography, and not literature, and yet is all three—constructs an imaginative geography that renders inseparable geography, geology, history, memory, and the act of writing.


Numeric set watermarking is a way to provide ownership proof for numerical data. Numerical data can be considered primitives for multimedia types such as images and videos, since these are organized forms of numeric information. The capability to watermark numerical data therefore directly implies the capability to watermark multimedia objects and discourage information theft on social networking sites and the Internet in general. Unfortunately, there has been very limited research in the field of numeric set watermarking, owing to underlying limitations on the number of items in the set and the LSBs in each item available for watermarking. In 2009, Gupta et al. proposed a numeric set watermarking model that embeds watermark bits in the items of the set based on a hash value of the items' most significant bits (MSBs). If an item is chosen for watermarking, a watermark bit is embedded in its least significant bits, and the replaced bit is inserted in the fractional value to provide reversibility. The authors show their scheme to be resilient against the traditional subset addition, deletion, and modification attacks as well as secondary watermarking attacks. In this paper, we present a bucket attack on this watermarking model. The attack consists of creating buckets of items with the same MSBs and determining whether the items in each bucket carry watermark bits. Experimental results show that the bucket attack is very strong and destroys the entire watermark with close to 100% success rate. We examine the inherent weaknesses in the watermarking model of Gupta et al. that leave it vulnerable to the bucket attack and propose potential safeguards that can provide resilience against this attack.
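The bucketing idea can be sketched as follows. This is an illustrative simplification, not the paper's exact attack: the bit widths and the perturbation strategy (randomising the LSBs of every bucketed item, so any LSB-embedded watermark is destroyed while the MSB-derived hash and the usable value of each item are preserved) are assumptions for the sake of the example:

```python
import random
from collections import defaultdict

def bucket_attack(items, lsb_bits=2, seed=0):
    """Group integer items that share the same most significant bits
    (everything above the low `lsb_bits`), then re-randomise the least
    significant bits of every item in each bucket.  A watermark embedded
    in those LSBs is wiped out, while each item's MSBs are unchanged."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for item in items:
        buckets[item >> lsb_bits].append(item)
    attacked = []
    for msb, members in buckets.items():
        for _ in members:
            attacked.append((msb << lsb_bits) | rng.getrandbits(lsb_bits))
    return attacked
```

Because items in the same bucket hash identically on their MSBs, an attacker does not need the secret key to know which bits the scheme may have touched.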


Motivated by the need for private set operations in a distributed environment, we extend the two-party private matching problem proposed by Freedman, Nissim and Pinkas (FNP) at Eurocrypt’04 to the distributed setting. By using a secret sharing scheme, we provide a distributed solution of the FNP private matching, called distributed private matching. In our distributed private matching scheme, we use a polynomial to represent one party’s dataset, as in FNP, and then distribute the polynomial to multiple servers. We extend our solution to the distributed set intersection and the cardinality of the intersection, and we further show how to apply distributed private matching to compute the distributed subset relation. Our work extends the private matching and set intersection primitives of Freedman et al. Our distributed construction may be of particular value when the dataset is outsourced and its privacy is the main concern: our distributed solutions preserve the utility of these set operations without compromising dataset privacy. Compared with previous works, we achieve a more efficient solution in terms of computation. All protocols constructed in this paper are provably secure against a semi-honest adversary under the Decisional Diffie-Hellman assumption.
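The polynomial representation at the core of FNP-style private matching can be illustrated in the clear, i.e. without the homomorphic encryption that provides the actual privacy. A set is encoded as the roots of a polynomial, so membership of y reduces to testing P(y) = 0; the function names and the explicit blinding factor below are a simplified sketch, not the protocol itself:

```python
def set_polynomial(dataset):
    """Coefficients (lowest degree first) of P(x) = prod_{a in S} (x - a).
    Every element of the dataset is a root of P."""
    coeffs = [1]
    for a in dataset:
        new = [0] * (len(coeffs) + 1)
        for i, c in enumerate(coeffs):
            new[i + 1] += c      # c * x^(i+1), from multiplying by x
            new[i] -= a * c      # -a * c * x^i, from multiplying by -a
        coeffs = new
    return coeffs

def evaluate(coeffs, x):
    """Evaluate the polynomial at x."""
    return sum(c * x ** i for i, c in enumerate(coeffs))

def matching_response(coeffs, y, blind):
    """FNP-style response in the clear: blind * P(y) + y equals y exactly
    when y is in the set (P(y) = 0); otherwise it is an unrelated value.
    In the real protocol this is computed under homomorphic encryption."""
    return blind * evaluate(coeffs, y) + y
```

In the actual FNP protocol the coefficients are encrypted and the evaluation is performed homomorphically, so the server learns nothing about the client's set; the distributed variant described above additionally secret-shares the polynomial across servers.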


This paper proposes a method for designing set-point regulation controllers for a class of underactuated mechanical systems in Port-Hamiltonian System (PHS) form. A new set of potential shape variables in closed loop is proposed, which can replace the set of open-loop shape variables (the configuration variables that appear in the kinetic energy). With this choice, the closed-loop potential energy contains free functions of the new variables. By expressing the regulation objective in terms of these new potential shape variables, the desired equilibrium can be assigned and there is freedom to reshape the potential energy to achieve performance whilst maintaining the PHS form in closed loop. This complements contemporary results in the literature, which preserve the open-loop shape variables. As a case study, we consider a robotic manipulator mounted on a flexible base and compensate for the motion of the base while positioning the end effector with respect to the ground reference. We compare the proposed control strategy with special cases that correspond to other energy shaping strategies previously proposed in the literature.
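As context for the energy-shaping design described above, a standard input-state-output port-Hamiltonian system and the shaped closed loop can be sketched as follows (a generic form, not the paper's specific parameterisation):

```latex
% Open-loop PHS with Hamiltonian H(x), interconnection J = -J^T,
% dissipation R = R^T >= 0, and input matrix g:
\dot{x} = \bigl[J(x) - R(x)\bigr]\nabla H(x) + g(x)\,u,
\qquad y = g(x)^{\top}\nabla H(x)

% Energy shaping seeks a feedback u(x) so that the closed loop is again
% a PHS, with a reshaped Hamiltonian H_d whose minimum sits at the
% desired equilibrium x*:
\dot{x} = \bigl[J_d(x) - R_d(x)\bigr]\nabla H_d(x),
\qquad x^{\ast} = \arg\min_x H_d(x)
```

The paper's contribution concerns which variables the reshaped potential part of \(H_d\) is allowed to depend on; the free functions mentioned in the abstract enter through that potential term.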


Background Flavonoids such as anthocyanins, flavonols and proanthocyanidins, play a central role in fruit colour, flavour and health attributes. In peach and nectarine (Prunus persica) these compounds vary during fruit growth and ripening. Flavonoids are produced by a well studied pathway which is transcriptionally regulated by members of the MYB and bHLH transcription factor families. We have isolated nectarine flavonoid regulating genes and examined their expression patterns, which suggests a critical role in the regulation of flavonoid biosynthesis. Results In nectarine, expression of the genes encoding enzymes of the flavonoid pathway correlated with the concentration of proanthocyanidins, which strongly increases at mid-development. In contrast, the only gene which showed a similar pattern to anthocyanin concentration was UDP-glucose-flavonoid-3-O-glucosyltransferase (UFGT), which was high at the beginning and end of fruit growth, remaining low during the other developmental stages. Expression of flavonol synthase (FLS1) correlated with flavonol levels, both temporally and in a tissue specific manner. The pattern of UFGT gene expression may be explained by the involvement of different transcription factors, which up-regulate flavonoid biosynthesis (MYB10, MYB123, and bHLH3), or repress (MYB111 and MYB16) the transcription of the biosynthetic genes. The expression of a potential proanthocyanidin-regulating transcription factor, MYBPA1, corresponded with proanthocyanidin levels. Functional assays of these transcription factors were used to test the specificity for flavonoid regulation. Conclusions MYB10 positively regulates the promoters of UFGT and dihydroflavonol 4-reductase (DFR) but not leucoanthocyanidin reductase (LAR). In contrast, MYBPA1 trans-activates the promoters of DFR and LAR, but not UFGT. This suggests exclusive roles of anthocyanin regulation by MYB10 and proanthocyanidin regulation by MYBPA1. 
Further, these transcription factors appeared to be responsive to both developmental and environmental stimuli.


Purpose Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite the widespread use of accelerometers, there is no standardized way to process and summarize data from them, which limits our ability to compare results across studies. This paper a) reviews decision rules researchers have used in the past, b) compares the impact of using different decision rules on a common data set, and c) identifies issues to consider for accelerometer data reduction. Methods The methods sections of studies published in 2003 and 2004 were reviewed to determine what decision rules previous researchers have used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, and the number of days used to calculate the outcome variables, and to extract bouts of moderate-to-vigorous physical activity (MVPA). For this study, four data reduction algorithms that employ different decision rules were used to analyze the same data set. Results The review showed that, among studies that reported their decision rules, much variability was observed. Overall, the analyses suggested that using different algorithms impacted several important outcome variables. The most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion had an impact on sample size and wearing time, which in turn affected many outcome variables. Conclusions These findings suggest that the decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
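The kinds of decision rules surveyed above can be sketched as one concrete reduction pipeline. The thresholds used here (a 60-minute zero-count run for non-wear, 600 wear minutes for a valid day, a 1952 counts-per-minute MVPA cut) are common choices in the literature, used purely for illustration; they are exactly the sort of parameters the paper shows can change the outcome variables:

```python
def wear_flags(counts, nonwear_run=60):
    """Mark each minute True (worn) unless it lies inside a run of at
    least `nonwear_run` consecutive zero-count minutes."""
    flags = [True] * len(counts)
    i = 0
    while i < len(counts):
        if counts[i] == 0:
            j = i
            while j < len(counts) and counts[j] == 0:
                j += 1                      # end of the zero run
            if j - i >= nonwear_run:
                for k in range(i, j):
                    flags[k] = False        # long zero run -> non-wear
            i = j
        else:
            i += 1
    return flags

def summarize_day(counts, nonwear_run=60, min_wear=600, mvpa_cut=1952):
    """Apply one set of decision rules to a day of per-minute counts."""
    worn = wear_flags(counts, nonwear_run)
    wear_minutes = sum(worn)
    return {
        "wear_minutes": wear_minutes,
        "valid_day": wear_minutes >= min_wear,
        "mvpa_minutes": sum(1 for c, w in zip(counts, worn)
                            if w and c >= mvpa_cut),
    }

# 2 h on the nightstand, 10 h of light activity, 30 min of MVPA:
day = summarize_day([0] * 120 + [500] * 600 + [2500] * 30)
```

Tightening any of the three parameters changes wear time, day validity, and MVPA minutes simultaneously, which is the comparability problem the paper documents.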


There’s a diagram that does the rounds online that neatly sums up the difference between the quality of equipment used in the studio to produce music, and the quality of the listening equipment used by the consumer...


Existing multi-model approaches for image set classification extract local models by clustering each image set individually only once, with fixed clusters used for matching with other image sets. However, this may result in the two closest clusters representing different characteristics of an object, due to undesirable environmental conditions (such as variations in illumination and pose). To address this problem, we propose to constrain the clustering of each query image set by forcing the clusters to have resemblance to the clusters in the gallery image sets. We first define a Frobenius norm distance between subspaces over Grassmann manifolds based on reconstruction error. We then extract local linear subspaces from a gallery image set via sparse representation. For each local linear subspace, we adaptively construct the corresponding closest subspace from the samples of a probe image set by joint sparse representation. We show that by minimising the sparse representation reconstruction error, we approach the nearest point on a Grassmann manifold. Experiments on the Honda, ETH-80 and Cambridge-Gesture datasets show that the proposed method consistently outperforms several other recent techniques, such as Affine Hull based Image Set Distance (AHISD), Sparse Approximated Nearest Points (SANP) and Manifold Discriminant Analysis (MDA).
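A Frobenius-norm distance between subspaces on a Grassmann manifold can be illustrated with the standard projection-matrix metric, which is related to, but not identical with, the paper's reconstruction-error formulation. The sketch assumes the subspace bases are already orthonormal and uses plain lists for small matrices:

```python
def matmul(A, B):
    """Matrix product of two lists-of-rows matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def projection_distance(U, V):
    """Distance between the subspaces spanned by the orthonormal columns
    of U and V, via their projection matrices P = U U^T:
        d(U, V) = ||U U^T - V V^T||_F / sqrt(2)
    This is a standard Grassmann metric, used here only to illustrate
    the idea of a Frobenius-norm distance between subspaces."""
    Pu = matmul(U, transpose(U))
    Pv = matmul(V, transpose(V))
    sq = sum((pu - pv) ** 2
             for ru, rv in zip(Pu, Pv) for pu, pv in zip(ru, rv))
    return (sq / 2) ** 0.5

# Two 1-D subspaces of R^2: the x-axis and the y-axis.
x_axis = [[1], [0]]
y_axis = [[0], [1]]
```

Because the projection matrix depends only on the span, not on the particular basis, this distance is well defined on the Grassmann manifold; identical subspaces give 0 and orthogonal one-dimensional subspaces give 1.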


In CB Richard Ellis (C) Pty Ltd v Wingate Properties Pty Ltd [2005] QDC 399 McGill DCJ examined whether the court now has a discretion to set aside an irregularly entered default judgment.