918 results for Kaski, Antti: The security complex: a theoretical analysis and the Baltic case
Abstract:
Neuroimaging (NI) technologies are having increasing impact in the study of complex cognitive and social processes. In this emerging field of social cognitive neuroscience, a central goal should be to increase the understanding of the interaction between the neurobiology of the individual and the environment in which humans develop and function. The study of sex/gender is often a focus for NI research, and may be motivated by a desire to better understand general developmental principles, mental health problems that show female-male disparities, and gendered differences in society. In order to ensure the maximum possible contribution of NI research to these goals, we draw attention to four key principles—overlap, mosaicism, contingency and entanglement—that have emerged from sex/gender research and that should inform NI research design, analysis and interpretation. We discuss the implications of these principles in the form of constructive guidelines and suggestions for researchers, editors, reviewers and science communicators.
Abstract:
The problem of MPLS network survivability analysis is considered in this paper. Survivability indexes are defined that take into account the specific features of MPLS networks, and an algorithm for their estimation is elaborated. The problem of MPLS network structure optimization under constraints on the survivability indexes is considered, and an algorithm for its solution is suggested. Experimental investigations were carried out and their results are presented.
Abstract:
G-protein coupled receptors (GPCRs) constitute the largest class of membrane proteins and are a major drug target. A serious obstacle to studying GPCR structure/function characteristics is the requirement to extract the receptors from their native environment in the plasma membrane, coupled with the inherent instability of GPCRs in the detergents required for their solubilization. In the present study, we report the first solubilization and purification of a functional GPCR [human adenosine A
Abstract:
* This work was financially supported by RFBR-04-01-00858.
Abstract:
An effective aperture approach is used as a tool for analysis and parameter optimization of the most widely used ultrasound imaging systems: phased array systems, compounding systems, and synthetic aperture imaging systems. Two characteristics of an imaging system, the effective aperture function and the corresponding two-way radiation pattern, provide information about two of the most important parameters of the images produced by an ultrasound system: lateral resolution and contrast. Therefore, during design, optimization of the effective aperture function leads to an optimal choice of the system parameters that influence the lateral resolution and contrast of the resulting images. It is shown that the effective aperture approach can be used for optimization of a sparse synthetic transmit aperture (STA) imaging system. A new two-stage algorithm is proposed for optimizing both the positions of the transmit elements and the weights of the receive elements. The proposed system employs a 64-element array with only four active elements used during transmit. The numerical results show that Hamming apodization gives the best compromise between image contrast and lateral resolution.
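A minimal sketch of the effective-aperture idea (not the authors' implementation; the array size matches the abstract, but the transmit-element positions are assumptions): the effective aperture of a linear array is the convolution of the transmit and receive aperture functions, and the two-way radiation pattern follows from its Fourier transform.

```python
import numpy as np

N = 64                                   # total array elements
tx = np.zeros(N)
tx[[0, 21, 42, 63]] = 1.0                # hypothetical positions of the 4 transmit elements
rx = np.hamming(N)                       # Hamming apodization on receive

effective = np.convolve(tx, rx)          # effective aperture = conv(transmit, receive)
pattern = np.fft.fftshift(np.fft.fft(effective, 1024))
pattern_db = 20 * np.log10(np.abs(pattern) / np.abs(pattern).max() + 1e-12)

# Main-lobe width of `pattern_db` indicates lateral resolution;
# side-lobe level indicates achievable contrast.
print(f"dynamic range of two-way pattern: {pattern_db.min():.1f} dB")
```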
Abstract:
This chapter provides the theoretical foundation and background on the Data Envelopment Analysis (DEA) method, some variants of the basic DEA models, and applications to various sectors. Some illustrative examples and helpful resources on DEA, including DEA software packages, are also presented. DEA is useful for measuring the relative efficiency of a variety of institutions and has its own merits and limitations. The chapter concludes that DEA results should be interpreted with caution to avoid giving wrong signals and providing inappropriate recommendations.
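As a hedged illustration of one basic DEA variant, the sketch below solves the input-oriented CCR envelopment model as a linear program for a handful of decision-making units (DMUs); the data and dimensions are invented, not taken from the chapter.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0],
              [3.0, 2.0],
              [4.0, 5.0]])               # inputs: one row per DMU
Y = np.array([[1.0],
              [1.0],
              [1.5]])                    # outputs: one row per DMU
n, m = X.shape                           # number of DMUs, inputs
s = Y.shape[1]                           # number of outputs

def ccr_efficiency(o):
    """min theta s.t. sum_j lam_j X[j] <= theta * X[o],
    sum_j lam_j Y[j] >= Y[o], lam >= 0 (decision vars: [theta, lam])."""
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.column_stack([-X[o], X.T])           # input constraints  <= 0
    A_out = np.column_stack([np.zeros(s), -Y.T])   # output constraints <= -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

An efficiency of 1.0 marks a DMU on the efficient frontier; scores below 1.0 indicate how far inputs could be proportionally reduced.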
Abstract:
Mathematics Subject Classification: 26A33, 93C83, 93C85, 68T40
Abstract:
The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
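A sketch of how a crossed Gage R&R ANOVA could feed a wider uncertainty budget, in the spirit of the hybrid methodology described; the data layout, synthetic values, and the extra budget term are assumptions, not the paper's procedure.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
parts, ops, reps = 10, 3, 2              # balanced crossed design (assumed)
rows = [(i, j, 10 + 0.1 * i + rng.normal(0, 0.05))
        for i in range(parts) for j in range(ops) for _ in range(reps)]
df = pd.DataFrame(rows, columns=["part", "operator", "y"])

def grr_components(df):
    """Variance components from a crossed parts x operators Gage R&R ANOVA."""
    p, o = df["part"].nunique(), df["operator"].nunique()
    r = len(df) // (p * o)
    grand = df["y"].mean()
    ss_p = o * r * ((df.groupby("part")["y"].mean() - grand) ** 2).sum()
    ss_o = p * r * ((df.groupby("operator")["y"].mean() - grand) ** 2).sum()
    cell = df.groupby(["part", "operator"])["y"].mean()
    ss_cells = r * ((cell - grand) ** 2).sum()
    ss_po = ss_cells - ss_p - ss_o
    ss_e = ((df["y"] - grand) ** 2).sum() - ss_cells
    ms_o = ss_o / (o - 1)
    ms_po = ss_po / ((p - 1) * (o - 1))
    ms_e = ss_e / (p * o * (r - 1))
    repeatability = ms_e                                  # equipment variation
    reproducibility = max((ms_o - ms_po) / (p * r), 0.0) \
                    + max((ms_po - ms_e) / r, 0.0)        # operator + interaction
    return repeatability, reproducibility

ev, av = grr_components(df)
u_cal = 0.02                              # assumed calibration standard uncertainty
u_c = np.sqrt(ev + av + u_cal ** 2)       # combined standard uncertainty
print(f"expanded uncertainty U (k=2): {2 * u_c:.4f}")
```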
Abstract:
Heat sinks are widely used for cooling electronic devices and systems. Their thermal performance is usually determined by the material, shape, and size of the heat sink. With the assistance of computational fluid dynamics (CFD) and surrogate-based optimization, heat sinks can be designed and optimized to achieve a high level of performance. In this paper, the design and optimization of a plate-fin-type heat sink cooled by an impingement jet is presented. The flow and thermal fields are simulated using CFD, and the thermal resistance of the heat sink is then estimated. A Kriging surrogate model is developed to approximate the objective function (thermal resistance) as a function of the design variables. Surrogate-based optimization is implemented by adaptively adding infill points based on an integrated strategy combining the minimum-value, maximum mean square error, and expected improvement approaches. The results show the influence of the design variables on the thermal resistance and give the optimal heat sink with the lowest thermal resistance for the given jet impingement conditions.
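The sketch below illustrates one of the named infill strategies, expected improvement, with a Gaussian-process (Kriging) surrogate; the one-dimensional test function standing in for the CFD-estimated thermal resistance and the loop settings are assumptions, not the paper's setup.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def thermal_resistance(x):               # hypothetical cheap stand-in for CFD
    return (x - 0.3) ** 2 + 0.05 * np.sin(25 * x)

X = np.linspace(0, 1, 5).reshape(-1, 1)  # initial design points
y = thermal_resistance(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
cand = np.linspace(0, 1, 501).reshape(-1, 1)

for _ in range(15):                      # adaptive infill loop
    gp.fit(X, y)
    mu, sd = gp.predict(cand, return_std=True)
    z = (y.min() - mu) / np.maximum(sd, 1e-12)
    ei = (y.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    x_new = cand[np.argmax(ei)]          # next infill point
    X = np.vstack([X, x_new])
    y = np.append(y, thermal_resistance(x_new))

print("best design:", X[np.argmin(y)].item(), "R_th:", y.min())
```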
Abstract:
The representation of serial position in sequences is an important topic in a variety of cognitive areas including the domains of language, memory, and motor control. In the neuropsychological literature, serial position data have often been normalized across different lengths, and an improved procedure for this has recently been reported by Machtynger and Shallice (2009). Effects of length and a U-shaped normalized serial position curve have been criteria for identifying working memory deficits. We present simulations and analyses to illustrate some of the issues that arise when relating serial position data to specific theories. We show that critical distinctions are often difficult to make based on normalized data. We suggest that curves for different lengths are best presented in their raw form and that binomial regression can be used to answer specific questions about the effects of length, position, and linear or nonlinear shape that are critical to making theoretical distinctions. © 2010 Psychology Press.
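A minimal sketch of the suggested binomial-regression analysis on invented correct/error counts: raw (unnormalized) data per serial position and list length, with a quadratic position term to probe a U-shaped curve.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "length":   [4] * 4 + [6] * 6,                        # list length
    "position": list(range(1, 5)) + list(range(1, 7)),    # serial position
    "correct":  [18, 15, 14, 17, 16, 12, 10, 9, 11, 15],  # invented counts
    "errors":   [2, 5, 6, 3, 4, 8, 10, 11, 9, 5],
})

# Two-column (correct, errors) binomial response; the quadratic term
# tests for nonlinear (U-shaped) position effects alongside length.
model = smf.glm("correct + errors ~ length + position + I(position ** 2)",
                data=data, family=sm.families.Binomial()).fit()
print(model.summary())
```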
Abstract:
Background and objective: Safe prescribing requires accurate and practical information about drugs. Our objective was to measure the utility of current sources of prescribing guidance when used to inform practical prescribing decisions, and to compare current sources of prescribing guidance in the UK with idealized prescribing guidance. Methods: We developed 25 clinical scenarios. Two independent assessors rated and ranked the performance of five common sources of prescribing guidance in the UK when used to answer the clinical scenarios. A third adjudicator facilitated review of any disparities. An idealized list of contents for prescribing guidance was developed and sent for comments to academics and users of prescribing guidance. Following consultation, an operational check was used to assess compliance with the idealized criteria. The main outcome measures were relative utility in answering the clinical scenarios and compliance with the idealized prescribing guidance. Results: Current sources of prescribing guidance used in the UK differ in their utility when measured using clinical scenarios. The British National Formulary (BNF) and EMIS LV were the best performing sources in terms of both ranking (mean rank 1.24 and 2.20) and rating (100% and 72% rated excellent or adequate). Current sources differed in the extent to which they fulfilled criteria for ideal prescribing guidance, but the BNF, and to a lesser extent EMIS LV, closely matched the criteria. Discussion: We have demonstrated how clinical scenarios can be used to assess prescribing guidance resources. Producers of prescribing guidance documents should consider our idealized template. Prescribers require high-quality information to support their practice. Conclusion: Our test was helpful in distinguishing between prescribing resources. Producers of prescribing guidance should consider the utility of their products to end-users, particularly in those more complex areas where prescribers may need most support. Existing UK prescribing guidance resources differ in their ability to provide assistance to prescribers. © 2010 Blackwell Publishing Ltd.
Abstract:
Many management scholars believe that the process used to make strategic decisions affects the quality of those decisions. However, several authors have observed a lack of research on the strategic decision-making process. Empirical tests of factors that have been hypothesized to affect the way strategic decisions are made are notably absent (Fredrickson, 1985). This paper reports the results of a study that attempts to assess the effects of decision-making circumstances, focusing mainly on the approaches applied and on the managerial skills and capabilities the decision makers built on during concrete strategic decision-making procedures. The study was conducted in California between September 2005 and June 2006 and was sponsored by a Fulbright Research Scholarship Grant.
Abstract:
The theoretical analysis and research of cultural activities have been limited, for the most part, to the study of the role the public sector plays in the funding and support of nonprofit Arts organizations. The tools used to evaluate this intervention follow a macroeconomic perspective and fail to account for microeconomic principles and assumptions that affect the behavior of these organizations. This dissertation describes through conceptual models the behavior of the agents involved in the artistic process and the economic sectors affected by it. The first paper deals with issues related to economic impact studies and formulates a set of guidelines that should be followed when conducting this type of study. One way to assess more accurately the impact culture has on a community is to assume that artists can re-create the public space of a blighted community and ready it for a regeneration process. The second paper of this dissertation assumes just that and explains in detail the cultural, political, economic and sociological interactions taking place in the Arts-led regeneration process in Miami Beach, Florida. The paper models the behavior of these agents by indicating what their goals and decision-process mechanisms are. The results support the claim that the public space artists create in a city actually stimulates development. The third paper discusses the estimation of a demand function for artistic activities, specifically the New World Symphony (NWS) located in Miami Beach, Florida. The behavior of the consumers and producers of NWS concerts is modeled. The results support the notion that consumers make their decisions based, among other things, on the perceived value these concerts have. Economists engage in the analysis of the effects of cultural activities on a community since many cities rely on them for their development. The history of many communities is no longer told by their assembly lines and machinery but by their centers of entertainment, hotels and restaurants. Many cities in Europe and North America that have seen the manufacturing sector migrate to the South are trying to face the demands of the new economy by using the Arts as catalysts for development.
Abstract:
Accounting students become practitioners facing ethical decision-making challenges that can be subject to various interpretations; hence, the profession is concerned with the appropriateness of their decisions. Moral development of these students has implications for a profession under legal challenges, negative publicity, and government scrutiny. Accounting students' moral development has been studied by examining their responses to moral questions in Rest's Defining Issues Test (DIT), their professional attitudes on Hall's Professionalism Scale Dimensions, and their ethical orientation-based professional commitment and ethical sensitivity. This study extended research in accounting ethics and moral development by examining students in a college where an ethics course is a requirement for graduation. Knowledge of differences in the moral development of accounting students may alert practitioners and educators to potential problems resulting from a lack of ethical understanding as measured by moral development levels. If student moral development levels differ by major, and accounting majors have lower levels than other students, the conclusion may be that this difference is a causative factor in the alleged acts of malfeasance in the profession that may result in malpractice suits. The current study compared 205 accounting, business, and nonbusiness students from a private university. In addition to academic major and completion of an ethics course, the other independent variable was academic level. Gender and age were tested as control variables, and Rest's DIT score was the dependent variable. The primary analysis was a 2 x 3 x 3 ANOVA with post hoc tests for results with a significant p-value of less than 0.05. The results of this study reveal that students who take an ethics course appear to have a higher level of moral development (p = 0.013), as measured by the DIT, than students at the same academic level who have not taken an ethics course. In addition, a statistically significant difference (p = 0.034) exists between freshmen who took an ethics class and juniors who did not take an ethics class. For every analysis except one, the lower class year with an ethics class had a higher level of moral development than the higher class year without an ethics class. These results appear to show that ethics education in particular has a greater effect on the level of moral development than education in general. Findings based on the gender-specific analyses appear to show that males and females respond differently to the effects of taking an ethics class. The male students do not appear to increase their moral development level after taking an ethics course (p = 0.693), but male levels of moral development differ significantly (p = 0.003) by major. Female levels of moral development appear to increase after taking an ethics course (p = 0.002); however, they do not differ according to major (p = 0.097). These findings indicate that accounting students should be required to take a class in ethics as part of their college curriculum. Students with an ethics class have a significantly higher level of moral development. The challenges facing the profession at the current time indicate that public confidence in the reports of client corporations has eroded, and one way to restore this confidence could be to require ethics training of future accountants.
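For illustration only, a sketch of the 2 x 3 x 3 factorial ANOVA described above, run on synthetic data; the column names, coding, and effect sizes are invented, and the study's data are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 205                                   # sample size matching the study
df = pd.DataFrame({
    "ethics_course": rng.choice(["yes", "no"], n),
    "major": rng.choice(["accounting", "business", "nonbusiness"], n),
    "level": rng.choice(["freshman", "junior", "senior"], n),
})
# Synthetic DIT-like scores with a small ethics-course effect built in
df["dit_score"] = (30 + 5 * (df["ethics_course"] == "yes")
                   + rng.normal(0, 10, n))

# Full-factorial model: main effects plus all interactions
model = smf.ols("dit_score ~ C(ethics_course) * C(major) * C(level)",
                data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```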
Abstract:
Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework, which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality, as argued by feminist theories of aging. Based on the results of the study, it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age. Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprising eleven unweighted indicators normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
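An illustrative sketch of the index construction as described: eleven indicators are z-scored, averaged into three equally weighted domains (human, material, social), and linearly aggregated. The indicator names and groupings are invented here; the study's actual indicators are not listed in the abstract.

```python
import numpy as np
import pandas as pd

domains = {                               # hypothetical indicator groupings
    "human":    ["education", "literacy", "health_status"],
    "material": ["income", "housing_quality", "asset_ownership", "utilities"],
    "social":   ["household_size", "marital_status", "community_ties", "benefits"],
}

def esvi(df: pd.DataFrame) -> pd.Series:
    z = (df - df.mean()) / df.std()       # z-score normalization per indicator
    domain_scores = pd.concat(
        {name: z[cols].mean(axis=1) for name, cols in domains.items()}, axis=1
    )
    return domain_scores.mean(axis=1)     # equal weights across the 3 domains

rng = np.random.default_rng(7)
cols = [c for group in domains.values() for c in group]   # 11 indicators
sample = pd.DataFrame(rng.normal(size=(100, len(cols))), columns=cols)
print(esvi(sample).describe())
```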