831 results for Choice-based welfare analysis, bounded rationality


Relevance: 100.00%

Publisher:

Abstract:

Plant and animal biodiversity can be studied by obtaining DNA directly from the environment. This new approach, in combination with the use of generic barcoding primers (metabarcoding), has been suggested as a complement or alternative to traditional biodiversity monitoring in ancient soil sediments. However, the extent to which metabarcoding truly reflects plant composition remains unclear, as does its power to identify species with no pollen or macrofossil evidence. Here, we compared pollen-based and metabarcoding approaches to explore the Holocene plant composition around two lakes in central Scandinavia. At one site, we also compared barcoding results with those obtained in earlier studies with species-specific primers. The pollen analyses revealed a larger number of taxa (46), the majority (78%) of which was not identified by metabarcoding. The metabarcoding identified 14 taxa (MTUs) but allowed identification to a lower taxonomic level. The combined analyses identified 52 taxa. The barcoding primers may favour amplification of certain taxa, as they did not detect taxa previously identified with species-specific primers. Taphonomy and the selectiveness of the primers are likely the major factors influencing these results. We conclude that metabarcoding from lake sediments provides a complementary, but not an alternative, tool to pollen analysis for investigating past flora. In the absence of other fossil evidence, metabarcoding gives a local and important signal from the vegetation, but the resulting assemblages show limited capacity to detect all taxa, regardless of their abundance around the lake. We suggest that metabarcoding be followed by pollen analysis and the use of species-specific primers to provide the most comprehensive signal from the environment. © 2013 Blackwell Publishing Ltd.
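The taxon bookkeeping described above (pollen taxa, metabarcoding MTUs, and their combination) amounts to simple set arithmetic; a minimal Python sketch with hypothetical taxon names standing in for the study's real lists:

```python
# Hypothetical taxon sets; the study's real lists are far larger
# (46 pollen taxa, 14 metabarcoding MTUs, 52 combined).
pollen = {"Pinus", "Betula", "Picea", "Alnus", "Salix"}
metabarcoding = {"Betula", "Salix", "Vaccinium"}

shared = pollen & metabarcoding        # taxa detected by both methods
pollen_only = pollen - metabarcoding   # taxa metabarcoding missed
combined = pollen | metabarcoding      # union of the two approaches

print(len(shared), len(pollen_only), len(combined))  # 2 3 6
```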

Relevance: 100.00%

Publisher:

Abstract:

As a comparatively recently invented PKM with over-constraints in its kinematic chains, the Exechon has attracted extensive attention from the research community. In contrast to its well-studied kinematics, the analysis of the stiffness characteristics of the Exechon remains a challenge due to its structural complexity. To achieve a thorough understanding of the stiffness characteristics of the Exechon PKM, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems and a fixed base subsystem, which are connected to each other sequentially through the corresponding joints. Each limb body is modeled as a spatial beam with a uniform cross-section, constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived with the Newtonian method, to construct the governing kinetostatic equations of the system after introducing the deformation compatibility conditions between the moving platform and the limbs. By extracting the 6 × 6 block matrix from the inverse of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM is computed efficiently, both at a typical configuration and throughout the workspace, with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position-dependency of the PKM's stiffness, which is symmetric about a work plane owing to the structural features. Finally, the effects of design variables such as structural, dimensional and stiffness parameters on system rigidity are investigated with the purpose of providing useful information for the structural optimization and performance enhancement of the Exechon PKM.
It is worth mentioning that the proposed stiffness modeling methodology can also be applied to other over-constrained PKMs and can evaluate their global rigidity over the workspace efficiently with minor revisions.
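The final extraction step (inverting the governing compliance matrix and reading off the 6 × 6 block associated with the moving platform) can be sketched numerically, assuming a random symmetric positive-definite matrix as a stand-in for the real assembled system:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 18                              # stand-in size for the assembled system
A = rng.standard_normal((n, n))
C = A @ A.T + n * np.eye(n)         # symmetric positive-definite "compliance"

K = np.linalg.inv(C)                # inversion of the governing matrix
K_platform = K[:6, :6]              # 6 x 6 block for the moving platform

# the extracted block inherits symmetry and positive-definiteness
assert np.allclose(K_platform, K_platform.T)
assert np.all(np.linalg.eigvalsh(K_platform) > 0)
```

Any principal submatrix of a symmetric positive-definite matrix is itself symmetric positive-definite, which is why the extracted block is a valid stiffness-like operator.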

Relevance: 100.00%

Publisher:

Abstract:

Social media channels, such as Facebook or Twitter, allow people to express their views and opinions about any public topic. Public sentiment related to future events, such as demonstrations or parades, indicates public attitude and may therefore be used to estimate the level of disruption and disorder during such events. Consequently, sentiment analysis of social media content may be of interest to different organisations, especially in the security and law enforcement sectors. This paper presents a new lexicon-based sentiment analysis algorithm designed with a focus on real-time Twitter content analysis. The algorithm consists of two key components, namely sentiment normalisation and an evidence-based combination function, which are used to estimate the intensity of the sentiment rather than a positive/negative label and to support mixed sentiment classification. Finally, we illustrate a case study examining the relation between the negative sentiment of Twitter posts related to the English Defence League and the level of disorder during events related to the organisation.
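A minimal sketch of the two components named above — word-level evidence combined and normalised into an intensity score in [-1, 1] — using a tiny hypothetical lexicon (the paper's actual normalisation and combination rules are more elaborate):

```python
# Hypothetical lexicon; real lexicons assign graded polarity to thousands of words.
LEXICON = {"good": 2.0, "great": 3.0, "bad": -2.0, "terrible": -3.0, "riot": -2.5}

def tweet_sentiment(text):
    """Combine word evidence and normalise intensity into [-1, 1]."""
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    if not scores:
        return 0.0                              # no evidence: neutral
    combined = sum(scores) / len(scores)        # simple evidence combination
    return max(-1.0, min(1.0, combined / 3.0))  # normalise by lexicon maximum

print(tweet_sentiment("terrible scenes riot downtown"))  # negative intensity
```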

Relevance: 100.00%

Publisher:

Abstract:

We describe a novel approach to exploring DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes and species. The article starts by analyzing chromosomal data through histograms over fixed-length DNA sequences. After creating the DNA-related histograms, the correlation between each pair of histograms is computed, producing a global correlation matrix. These data are then used as input to several data processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed, and the extensive results reveal that the proposed method is able to generate significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics.
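A minimal sketch of the first two stages described above — histograms over fixed-length DNA subsequences, followed by a correlation matrix between them — with short hypothetical fragments in place of real chromosomal data:

```python
import itertools
import numpy as np

def kmer_histogram(seq, k=2):
    """Normalised histogram over all fixed-length-k subsequences of seq."""
    kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
    counts = dict.fromkeys(kmers, 0)
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    total = max(1, len(seq) - k + 1)
    return np.array([counts[km] / total for km in kmers])

# Hypothetical fragments; the study processed chromosomes of 18 species.
seqs = ["ACGTACGTACGT", "ACGTACGAACGT", "TTTTGGGGCCCC"]
H = np.vstack([kmer_histogram(s) for s in seqs])
R = np.corrcoef(H)                  # global correlation matrix
print(R.shape)                      # (3, 3)
```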

Relevance: 100.00%

Publisher:

Abstract:

Image analysis and graphics synthesis can be achieved with learning techniques that use image examples directly, without physically-based 3D models. In our technique: -- the mapping from novel images to a vector of "pose" and "expression" parameters can be learned from a small set of example images using a function approximation technique that we call an analysis network; -- the inverse mapping from input "pose" and "expression" parameters to output images can be synthesized from a small set of example images and used to produce new images using a similar synthesis network. The techniques described here have several applications in computer graphics, special effects, interactive multimedia and very-low-bandwidth teleconferencing.
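The analysis/synthesis pairing above can be sketched with linear least-squares standing in for the function-approximation networks; all data here are synthetic stand-ins (random "images" and linearly generated "pose" parameters), not the paper's examples:

```python
import numpy as np

# Synthetic stand-in data: 20 example "images" (flattened pixels) paired
# with 2 pose/expression parameters each.
rng = np.random.default_rng(1)
images = rng.standard_normal((20, 64))
poses = images @ rng.standard_normal((64, 2))

# "Analysis network": images -> pose parameters
W_analysis, *_ = np.linalg.lstsq(images, poses, rcond=None)
# "Synthesis network": pose parameters -> images
W_synthesis, *_ = np.linalg.lstsq(poses, images, rcond=None)

novel_pose = np.array([0.5, -1.0])
novel_image = novel_pose @ W_synthesis   # synthesize an image from parameters
estimated = novel_image @ W_analysis     # analyze it back to parameters
print(estimated.shape)                   # (2,)
```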

Relevance: 100.00%

Publisher:

Abstract:

Genetic association analyses of family-based studies with ordered categorical phenotypes are often conducted using methods designed either for quantitative or for binary traits, which can lead to suboptimal analyses. Here we present an alternative likelihood-based method of analysis for single nucleotide polymorphism (SNP) genotypes and ordered categorical phenotypes in nuclear families of any size. Our approach, which extends our previous work for binary phenotypes, permits straightforward inclusion of covariate, gene-gene and gene-covariate interaction terms in the likelihood, incorporates a simple model for ascertainment and allows for family-specific effects in the hypothesis test. Additionally, our method produces interpretable parameter estimates and valid confidence intervals. We assess the proposed method using simulated data, and apply it to a polymorphism in the C-reactive protein (CRP) gene typed in families collected to investigate human systemic lupus erythematosus. By including sex interactions in the analysis, we show that the polymorphism is associated with anti-nuclear autoantibody (ANA) production in females, while there appears to be no effect in males.
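The ordinal-phenotype modelling above can be illustrated with a generic cumulative-logit likelihood for a single individual (a standard ordinal model, not the paper's exact family-based likelihood with ascertainment and interaction terms):

```python
import math

def ordered_category_probs(eta, cutpoints):
    """P(phenotype = k) under a cumulative-logit model with linear
    predictor eta (e.g. beta * risk-allele count plus covariate terms)."""
    cdf = lambda x: 1.0 / (1.0 + math.exp(-x))
    cum = [cdf(c - eta) for c in cutpoints] + [1.0]
    return [cum[0]] + [cum[i] - cum[i - 1] for i in range(1, len(cum))]

# Three ordered phenotype levels; genotype carries one risk allele (beta = 0.5).
probs = ordered_category_probs(eta=0.5 * 1, cutpoints=[-1.0, 1.0])
log_likelihood = math.log(probs[2])      # individual observed in top category
print(len(probs), round(sum(probs), 6))  # 3 1.0
```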

Relevance: 100.00%

Publisher:

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for characterizing the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to describe the spatial distribution of pixel intensities in mammographic images. These methods were tested on a set of 57 breast masses and 60 normal breast parenchyma samples (dataset 1), and on another set of 19 architectural distortions and 41 normal breast parenchyma samples (dataset 2). Support vector machines (SVMs) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve among the five methods for both datasets (Az = 0.839 for dataset 1 and 0.828 for dataset 2). Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarity than ROIs depicting normal breast parenchyma. The combination of the FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than any single feature alone for both datasets.
The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images because its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart to the fractal dimension for texture feature extraction in mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
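The two texture measures discussed above can be sketched for a binary image: a box-counting estimate of fractal dimension (one of several estimators compared in such studies; FBM modelling is a separate method) and a gliding-box lacunarity. Both are toy implementations on synthetic data, not the paper's pipeline:

```python
import numpy as np

def box_count_dimension(img, sizes=(1, 2, 4, 8)):
    """Box-counting fractal-dimension estimate for a binary image."""
    counts = []
    for s in sizes:
        h, w = img.shape
        boxes = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(int((boxes.sum(axis=(1, 3)) > 0).sum()))
    # slope of log(count) versus log(1/size)
    return np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)[0]

def lacunarity(img, s=4):
    """Gliding-box lacunarity: variance over squared mean of box masses, + 1."""
    h, w = img.shape
    masses = [img[i:i + s, j:j + s].sum()
              for i in range(h - s + 1) for j in range(w - s + 1)]
    m = np.mean(masses)
    return np.var(masses) / (m * m) + 1.0

img = np.ones((16, 16))                     # filled square: dimension ~ 2
print(round(box_count_dimension(img), 2))   # 2.0
print(lacunarity(img))                      # 1.0 (perfectly homogeneous)
```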

Relevance: 100.00%

Publisher:

Abstract:

We develop a transaction cost economics theory of the family firm, building upon the concepts of family-based asset specificity, bounded rationality, and bounded reliability. We argue that the prosperity and survival of family firms depend on the absence of a dysfunctional bifurcation bias. The bifurcation bias is an expression of bounded reliability, reflected in the de facto asymmetric treatment of family vs. nonfamily assets (especially human assets). We propose that absence of bifurcation bias is critical to fostering reliability in family business functioning. Our study ends the unproductive divide between the agency and stewardship perspectives of the family firm, which offer conflicting accounts of this firm type's functioning. We show that the predictions of the agency and stewardship perspectives can be usefully reconciled when focusing on how family firms address the bifurcation bias or fail to do so.

Relevance: 100.00%

Publisher:

Abstract:

Aspect-oriented programming (AOP) is a promising technology that supports the separation of crosscutting concerns (i.e., functionality that tends to be tangled with, and scattered through, the rest of the system). In AOP, a method-like construct named advice is applied to join points in the system through a special construct named pointcut. This mechanism supports the modularization of crosscutting behavior; however, since the added interactions are not explicit in the source code, it is hard to ensure their correctness. To tackle this problem, this paper presents a rigorous coverage analysis approach to ensure that the logic of each advice - statements, branches, and def-use pairs - is exercised at each affected join point. To make this analysis possible, a structural model based on Java bytecode - called the PointCut-based Def-Use Graph (PCDU) - is proposed, along with three integration testing criteria. Theoretical, empirical, and exploratory studies involving 12 aspect-oriented programs and several fault examples present evidence of the feasibility and effectiveness of the proposed approach. (C) 2010 Elsevier Inc. All rights reserved.
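The AOP vocabulary above (advice, join point, pointcut) can be mimicked in plain Python with decorators; a toy sketch, not the paper's Java-bytecode PCDU model, that also records which advice paths were exercised at a join point:

```python
import functools

executed = set()                              # crude advice-coverage record

def logging_advice(fn):
    """Before/after advice wrapped around a join point (a method call)."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        executed.add(fn.__name__ + ":before")
        result = fn(*args, **kwargs)
        executed.add(fn.__name__ + ":after")
        return result
    return wrapper

def apply_pointcut(cls, name_prefix):
    """Name-based pointcut: weave advice into every matching method."""
    for name in dir(cls):
        if name.startswith(name_prefix):
            setattr(cls, name, logging_advice(getattr(cls, name)))

class Account:
    def __init__(self):
        self.balance = 0
    def deposit_cash(self, amount):
        self.balance += amount

apply_pointcut(Account, "deposit")
acct = Account()
acct.deposit_cash(10)
print(sorted(executed))   # both advice paths exercised at the join point
```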

Relevance: 100.00%

Publisher:

Abstract:

In recent years, regulating agencies of many countries in the world, following recommendations of the Basel Committee, have compelled financial institutions to maintain minimum capital requirements to cover market risk. This paper investigates the consequences of this kind of regulation for social welfare and the soundness of financial institutions through an equilibrium model. We show that the optimum level of regulation for each financial institution (the level that maximizes its utility) depends on its appetite for risk, and that some institutions can perform better in a regulated economy. In addition, another important result asserts that under certain market conditions the financial fragility of an institution can be greater in a regulated economy than in an unregulated one.
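The risk-appetite dependence described above can be illustrated with a toy calculation: a capital requirement caps an institution's risk exposure, and whether the cap binds depends on its risk aversion. Mean-variance preferences stand in for the paper's equilibrium model, and all parameter values are illustrative:

```python
def optimal_exposure(capital_req, risk_aversion, mu=0.08, sigma2=0.04):
    """Risk exposure chosen under a capital requirement (toy model)."""
    cap = 1.0 / capital_req                        # exposure allowed by regulation
    unconstrained = mu / (risk_aversion * sigma2)  # mean-variance optimum
    return min(cap, unconstrained)

# A cautious institution is unaffected by a 10% requirement...
print(optimal_exposure(0.10, risk_aversion=4.0))    # 0.5
# ...while a risk-hungry one is bound by it.
print(optimal_exposure(0.10, risk_aversion=0.02))   # 10.0
```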