901 results for "Nontrivial critical point of a polynomial"
Abstract:
We present a comprehensive experimental and theoretical investigation of the thermodynamic properties (specific heat, magnetization, and thermal expansion) in the vicinity of the field-induced quantum critical point (QCP) around the lower critical field H_c1 ≈ 2 T in NiCl2-4SC(NH2)2. A T^(3/2) behavior in the specific heat and magnetization is observed at very low temperatures at H = H_c1, which is consistent with the universality class of Bose-Einstein condensation of magnons. The temperature dependence of the thermal expansion coefficient at H_c1 shows minor deviations from the expected T^(1/2) behavior. Our experimental study is complemented by analytical calculations and quantum Monte Carlo simulations, which nicely reproduce the measured quantities. We analyze the thermal and magnetic Grüneisen parameters, which are ideal quantities for identifying QCPs. Both parameters diverge at H_c1 with the expected T^(-1) power law. Using the Ehrenfest relations at the second-order phase transition, we estimate the pressure dependences of the characteristic temperature and field scales.
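For context, the quoted T^(-1) divergence matches the standard quantum-critical scaling expectation for the magnetic Grüneisen parameter. In textbook notation (an aside on the standard definition, not the paper's own formulas):

```latex
% Magnetic Grüneisen parameter: standard definition via the
% magnetocaloric effect, diverging at the field-induced QCP.
\Gamma_H \;=\; -\,\frac{(\partial M/\partial T)_H}{C_H}
         \;=\; \frac{1}{T}\left(\frac{\partial T}{\partial H}\right)_S ,
\qquad
\Gamma_H \;\propto\; T^{-1} \quad \text{at } H = H_{c1} .
```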
Abstract:
This letter presents a new recursive method for computing discrete polynomial transforms. The method is shown for forward and inverse transforms of the Hermite, binomial, and Laguerre types. The recursive flow diagrams require only 2N additions, 2(N+1) memory units, and N+1 multipliers for the (N+1)-point Hermite and binomial transforms. The recursive flow diagram for the (N+1)-point Laguerre transform requires 2N additions, 2(N+1) memory units, and 2(N+1) multipliers. The transform computation time for all of these transforms is O(N).
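Recursive schemes of this kind rest on the three-term recurrences that discrete orthogonal polynomials satisfy: each basis row can be built from the two previous rows. A minimal illustrative sketch for the Hermite case (assuming N denotes the transform length; this uses the standard physicists' Hermite recurrence and is not the letter's actual flow-diagram design):

```python
import numpy as np

def hermite_matrix(points, order):
    """Rows 0..order of Hermite polynomial values at `points`, built with
    the three-term recurrence H_{n+1}(x) = 2x*H_n(x) - 2n*H_{n-1}(x).
    Each new row needs only the two previous rows, which is the kind of
    recursive structure a flow-diagram implementation can exploit."""
    x = np.asarray(points, dtype=float)
    H = np.zeros((order + 1, x.size))
    H[0] = 1.0
    if order >= 1:
        H[1] = 2.0 * x
    for n in range(1, order):
        H[n + 1] = 2.0 * x * H[n] - 2.0 * n * H[n - 1]
    return H

def forward_hermite_transform(samples, points, order):
    """Unnormalised forward discrete Hermite transform of `samples`."""
    return hermite_matrix(points, order) @ np.asarray(samples, dtype=float)
```

The inverse transform would apply the (suitably normalised) transpose; the letter's contribution is the low operation count of the recursive flow diagrams, which this dense-matrix sketch does not attempt to reproduce.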
Abstract:
In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault’s concept of governmentality, briefly defined as a technology of power that influences or shapes behavior from a distance. This form of governance operates through apparatuses of security, which include higher education. Foucault identified three essential characteristics of an apparatus—the market, the milieu, and the processes of normalization—through which administrative mechanisms and practices operate and govern populations. In this project, my primary focus is on the governance of faculty and administrators, as a population, at residential colleges and universities. I argue that the existing milieu of accountability is one dominated by the neoliberal assumption that all activity—including higher education—works best when governed by market forces alone, reducing higher education to a market-mediated private good. Under these conditions, what many in the academy believe is an essential purpose of higher education—to educate students broadly, to contribute knowledge for the public good, and to serve as society’s critic and social conscience (Washburn 227)—is being eroded. Although NSSE emerged as a form of resistance to commercial college rankings, it did not challenge the forces that empowered the rankings in the first place. Indeed, NSSE data are now being used to make institutions even more responsive to market forces. Furthermore, NSSE’s use has a normalizing effect that tends to homogenize classroom practices and erode the autonomy of faculty in the educational process. It also positions students as part of the system of surveillance. 
In the end, if aspects of higher education that are essential to maintaining a civil society are left to be defined solely in market terms, the result may be a less vibrant and, ultimately, a less just society.
Abstract:
We explore a method developed in statistical physics which has been argued to have exponentially small finite-volume effects, in order to determine the critical temperature Tc of pure SU(3) gauge theory close to the continuum limit. The method allows us to estimate the critical coupling βc of the Wilson action for temporal extents up to Nτ ∼ 20 with ≲ 0.1% uncertainties. Making use of the scale-setting parameters r0 and √t0 in the same range of β values, these results lead to the independent continuum extrapolations Tc·r0 = 0.7457(45) and Tc·√t0 = 0.2489(14), with the latter originating from a more convincing fit. Inserting a conversion of r0 from the literature (unfortunately with much larger errors) yields Tc/Λ_MSbar = 1.24(10).
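Continuum extrapolations of this kind are typically linear fits in the squared lattice spacing. A hedged sketch of the procedure on synthetic numbers (chosen only to illustrate the fit, not the paper's data):

```python
import numpy as np

# Finite-lattice-spacing results are modelled as
#     Tc*r0(a) = c0 + c1*(a/r0)^2
# and extrapolated to a -> 0 by a linear fit in (a/r0)^2.
# The data points below are synthetic and noise-free, chosen to
# reproduce the quoted continuum value for illustration.
a2 = np.array([0.04, 0.02, 0.01, 0.005])   # (a/r0)^2 values
tc_r0 = 0.7457 + 0.9 * a2                  # synthetic measurements

c1, c0 = np.polyfit(a2, tc_r0, 1)          # slope, intercept
continuum_value = c0                       # estimate at a = 0
```

With real data the points carry errors and the fit would be weighted, with the fit quality deciding between the r0 and √t0 extrapolations as in the abstract.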
Abstract:
BACKGROUND Implementation of user-friendly, real-time, electronic medical records for patient management may lead to improved adherence to clinical guidelines and improved quality of patient care. We detail the systematic, iterative process that implementation partners, Lighthouse clinic and Baobab Health Trust, employed to develop and implement a point-of-care electronic medical records system in an integrated, public clinic in Malawi that serves HIV-infected and tuberculosis (TB) patients. METHODS Baobab Health Trust, the system developers, conducted a series of technical and clinical meetings with Lighthouse and the Ministry of Health to determine specifications. Multiple pre-testing sessions assessed patient flow, question clarity, and information sequencing, and verified compliance with national guidelines. Final components of the TB/HIV electronic medical records system include: patient demographics; anthropometric measurements; laboratory samples and results; HIV testing; WHO clinical staging; TB diagnosis; family planning; clinical review; and drug dispensing. RESULTS Our experience suggests that an electronic medical records system can improve patient management, enhance integration of TB/HIV services, and improve provider decision-making. However, despite sufficient funding and motivation, several challenges delayed system launch, including: expansion of system components to include HIV testing and counseling services; changes in the national antiretroviral treatment guidelines that required system revision; and new healthcare workers' low confidence in using the system. To ensure a more robust and agile system that met all stakeholder and user needs, our electronic medical records launch was delayed more than a year. Open communication with stakeholders, careful consideration of ongoing provider input, and a well-functioning, backup, paper-based TB registry helped ensure successful implementation and sustainability of the system.
Additional on-site technical support provided reassurance and swift problem-solving during the extended launch period. CONCLUSION Even when system users are closely involved in the design and development of an electronic medical record system, it is critical to allow sufficient time for software development, solicitation of detailed feedback from both users and stakeholders, and iterative system revisions to successfully transition from paper to point-of-care electronic medical records. For those in low-resource settings, electronic medical records for integrated care are a feasible and positive innovation.
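The record components listed in the METHODS section could be grouped in code roughly as follows. This is a hypothetical schema for illustration only; the field names and types are assumptions, not Baobab Health Trust's actual data model:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TBHIVRecord:
    """Hypothetical grouping of the TB/HIV EMR components listed above."""
    patient_id: str
    demographics: dict = field(default_factory=dict)      # name, age, sex, address
    anthropometrics: dict = field(default_factory=dict)   # weight, height
    lab_results: list = field(default_factory=list)       # samples and results
    hiv_test: Optional[str] = None                        # HIV testing outcome
    who_stage: Optional[int] = None                       # WHO clinical stage 1-4
    tb_diagnosis: Optional[str] = None
    family_planning: Optional[str] = None
    clinical_reviews: list = field(default_factory=list)
    dispensed_drugs: list = field(default_factory=list)
```

A point-of-care system would wrap such a record in validation against the national guidelines; the paper's point is that those guidelines changed mid-project, forcing schema revisions.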
Abstract:
Precise and reproducible surface nanopatterning is key to the successful ordered growth of GaN nanocolumns. In this work, we point out the main technological issues related to the patterning process, mainly surface roughness, cleaning, and mask adhesion to the substrate. We found that each of these process-related factors has a dramatic impact on the subsequent selective growth of the columns inside the patterned holes. We compare the performance of e-beam lithography, colloidal lithography, and focused ion beam in the fabrication of hole-patterned masks for ordered columnar growth. These results are applicable to the ordered growth of nanocolumns of different materials.
Abstract:
Innovation in Software-intensive Systems is becoming relevant for several reasons: software is embedded in many sectors such as automotive, robotics, mobile phones, and health care. Firms need to understand the factors affecting innovation to increase the probability of success in their product development, and the assessment of innovation in software products is a powerful mechanism for capturing this knowledge. Companies therefore need to assess their products from an innovation perspective to reduce the gap between the products they develop and the market. This is even more relevant for Software-intensive Systems, where real time, timeliness, complexity, interoperability, reactivity, and resource sharing are critical features of new systems. Product innovation assessment has been studied before and some assessment schemas have been defined, but they are not specific to Software-intensive Systems; in addition, no consensus has been reached on either the factors or the procedure for performing an assessment. It therefore makes sense to work on the definition of an innovation evaluation framework focused on Software-intensive Systems. This thesis identifies the elements needed to build a framework for assessing Software-intensive Systems from the innovation perspective. Two components have been identified as parts of the framework: a reference model, and an adaptive, customizable tool to perform the assessment and position product innovation. The reference model is composed of four main elements characterizing product innovation assessment: concepts, innovation models, assessment questionnaires, and product assessment. The reference model provides the basis for defining instances of product innovation assessment models that can be assessed and positioned through questionnaires in the proposed tool, which also automates the assessment and the positioning of product innovation. The reference model has been rigorously built by applying conceptual modelling and view integration together with qualitative research methods. The tool has been used to assess products such as Skype through models instantiated from the reference model.
Abstract:
We show numerical evidence that, at low enough temperatures, the potential energy density of a glass-forming liquid fluctuates over length scales much larger than the interaction range. We focus on the behavior of translationally invariant quantities. The growing correlation length is unveiled by studying the finite-size effects. In the thermodynamic limit, the specific heat and the relaxation time diverge as a power law. Both features point towards the existence of a critical point in the metastable supercooled liquid phase.
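Power-law divergences of the kind described can in principle be extracted from simulation data by linear regression in log-log variables. A minimal sketch on synthetic data (the critical temperature, exponent, and amplitude below are made up for illustration, not taken from the paper):

```python
import numpy as np

# Synthetic specific-heat data obeying C = A * (T - Tc)^(-alpha).
# In log-log variables this is a straight line:
#     log C = log A - alpha * log(T - Tc),
# so the exponent is minus the fitted slope.
Tc, alpha, A = 1.0, 0.7, 2.0           # assumed values, illustration only
T = np.linspace(1.05, 1.5, 20)
C = A * (T - Tc) ** (-alpha)

slope, intercept = np.polyfit(np.log(T - Tc), np.log(C), 1)
estimated_alpha = -slope
estimated_amplitude = np.exp(intercept)
```

With real simulation data, Tc itself is unknown and is usually treated as a third fit parameter, which is where the finite-size analysis mentioned in the abstract comes in.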
Abstract:
This thesis considers the main theoretical positions within the contemporary sociology of nationalism. These can be grouped into two basic types: primordialist theories, which assert that nationalism is an inevitable aspect of all human societies, and modernist theories, which assert that nationalism and the nation-state first developed within western Europe in recent centuries. With respect to primordialist approaches to nationalism, it is argued that the main common explanation offered is human biological propensity. Consideration is concentrated on the most recent and plausible of such theories, sociobiology. Sociobiological accounts root nationalism and racism in genetic programming which favours close kin, or rather in the redirection of this programming in complex societies, where the social group is not a kin group. It is argued that the stated assumptions of the sociobiologists do not entail the conclusions they draw as to the roots of nationalism, and that in order to arrive at such conclusions further and implausible assumptions have to be made. With respect to modernists, the first group of writers considered are those, represented by Carlton Hayes, Hans Kohn and Elie Kedourie, whose main thesis is that the nation-state and nationalism are recent phenomena. Next, the two major attempts to relate nationalism and the nation-state to imperatives specific either to capitalist societies (in the `orthodox' marxist theory elaborated about the turn of the twentieth century) or to the processes of modernisation and industrialisation (the `Weberian' account of Ernest Gellner) are discussed. It is argued that modernist accounts can only be sustained by starting from a definition of nationalism and the nation-state which conflates such phenomena with others which are specific to the modern world.
The marxist and Gellner accounts form the necessary starting point for any explanation as to why the nation-state is apparently the sole viable form of polity in the modern world, but their assumption that no pre-modern society was national leaves them without an adequate account of the earliest origins of the nation-state and of nationalism. Finally, a case study from the history of England argues that both the achievement of a national state form and the elucidation of crucial components of a nationalist ideology were attained at a period not consistent with any of the versions of the modernist thesis.
Abstract:
In this work we give sufficient conditions for k-th approximations of the polynomial roots of f(x) under which the Maehly-Aberth-Ehrlich, Werner-Börsch-Supan, Tanabe, and Improved Börsch-Supan iteration methods fail on the next step. For these methods all non-attractive sets are found. This is a subsequent improvement of previously developed techniques and known facts. Users of these methods can apply the results presented here in software implementations for distributed applications and simulation environments. Numerical examples with graphics are shown.
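For reference, the Aberth-Ehrlich iteration that these variants build on has the standard textbook form sketched below. This is a minimal sketch of the basic simultaneous iteration only; it does not reproduce the paper's failure conditions or non-attractive sets:

```python
import numpy as np

def aberth(coeffs, tol=1e-12, max_iter=200):
    """Aberth-Ehrlich simultaneous iteration for all roots of the
    polynomial with coefficients `coeffs` (highest degree first)."""
    p = np.poly1d(coeffs)
    dp = p.deriv()
    n = p.order
    # Initial guesses on a circle within the Cauchy root bound,
    # offset so they are not symmetric about the real axis.
    radius = 1.0 + max(abs(c / coeffs[0]) for c in coeffs[1:])
    z = radius * np.exp(2j * np.pi * (np.arange(n) + 0.25) / n)
    for _ in range(max_iter):
        w = p(z) / dp(z)                         # Newton corrections
        # Repulsion sums: sum_{j != k} 1 / (z_k - z_j)
        s = np.array([np.sum(1.0 / (z[k] - np.delete(z, k)))
                      for k in range(n)])
        step = w / (1.0 - w * s)
        z = z - step
        if np.max(np.abs(step)) < tol:
            break
    return z
```

The paper's concern is precisely the starting configurations for which such an iteration breaks down on the next step, which this generic sketch does not guard against.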
Abstract:
This paper is dedicated to Prof. Nikolay Kyurkchiev on the occasion of his 70th anniversary. This paper gives sufficient conditions for k-th approximations of the zeros of a polynomial f(x) under which Kyurkchiev's method fails on the next step. The research is linked with an attack on the global-convergence hypothesis for this method, which is commonly used in practice (as a correlate of the hypothesis for the Weierstrass-Dochev method). Graphical examples are presented.
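The Weierstrass-Dochev method mentioned as a correlate (also known as the Durand-Kerner method) admits a compact sketch. Again, this is the standard textbook iteration under common default starting points, not the paper's failure analysis:

```python
import numpy as np

def weierstrass_dochev(coeffs, tol=1e-12, max_iter=500):
    """Weierstrass-Dochev (Durand-Kerner) simultaneous root iteration
    for the polynomial with coefficients `coeffs` (highest degree first)."""
    p = np.poly1d(coeffs)
    n = p.order
    lead = coeffs[0]
    z = (0.4 + 0.9j) ** np.arange(n)   # classic non-symmetric start
    for _ in range(max_iter):
        # Weierstrass correction: W_k = p(z_k) / (a_0 * prod_{j!=k}(z_k - z_j))
        corr = np.array([p(z[k]) / (lead * np.prod(z[k] - np.delete(z, k)))
                         for k in range(n)])
        z = z - corr
        if np.max(np.abs(corr)) < tol:
            break
    return z
```

The global-convergence question the paper attacks is whether such iterations converge from (essentially) arbitrary distinct starting points; the sketch above simply uses the conventional geometric start.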
Abstract:
Motivation: In any macromolecular polyprotic system (for example, protein, DNA, or RNA), the isoelectric point, commonly referred to as the pI, can be defined as the point of singularity in a titration curve: the solution pH value at which the net overall surface charge, and thus the electrophoretic mobility, of the ampholyte sums to zero. Different modern analytical biochemistry and proteomics methods depend on the isoelectric point as a principal feature for protein and peptide characterization. Protein separation by isoelectric point is a critical part of 2-D gel electrophoresis, a key precursor of proteomics, where discrete spots can be digested in-gel and proteins subsequently identified by analytical mass spectrometry. Peptide fractionation according to pI is also widely used in current proteomics sample preparation procedures prior to LC-MS/MS analysis. Therefore, accurate theoretical prediction of pI would expedite such analyses. While pI calculation is widely used, it remains largely untested, motivating our efforts to benchmark pI prediction methods. Results: Using data from the database PIP-DB and one publicly available dataset as our reference gold standard, we have undertaken the benchmarking of pI calculation methods. We find that methods vary in their accuracy and are highly sensitive to the choice of basis set. The machine-learning algorithms, especially the SVM-based algorithm, showed superior performance when studying peptide mixtures. In general, learning-based pI prediction methods (such as Cofactor, SVM and Branca) require a large training dataset and their resulting performance strongly depends on the quality of that data. In contrast to iterative methods, machine-learning algorithms have the advantage of being able to add new features to improve the accuracy of prediction.
Contact: yperez@ebi.ac.uk
Availability and Implementation: The software and data are freely available at https://github.com/ypriverol/pIR.
Supplementary information: Supplementary data are available at Bioinformatics online.
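The iterative pI calculations being benchmarked typically bisect on pH until the Henderson-Hasselbalch net charge crosses zero. A minimal sketch of that idea (the pKa values below are one common textbook set, an assumption rather than the values used by any particular method in the paper):

```python
# Protonated forms of these groups carry +1; pKa values are assumed,
# illustrative textbook numbers, not any benchmarked method's basis set.
PKA_POS = {"nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}
# Deprotonated forms of these groups carry -1.
PKA_NEG = {"cterm": 2.0, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}

def net_charge(seq, ph):
    """Henderson-Hasselbalch net charge of a one-letter-code peptide."""
    pos_groups = ["nterm"] + [aa for aa in seq if aa in PKA_POS]
    neg_groups = ["cterm"] + [aa for aa in seq if aa in PKA_NEG]
    pos = sum(1.0 / (1.0 + 10 ** (ph - PKA_POS[g])) for g in pos_groups)
    neg = sum(-1.0 / (1.0 + 10 ** (PKA_NEG[g] - ph)) for g in neg_groups)
    return pos + neg

def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-6):
    """Bisect on pH: net charge decreases monotonically with pH,
    so the zero crossing is the pI."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_charge(seq, mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The sensitivity to the "basis set" reported in the Results corresponds to swapping the pKa tables above; the learning-based methods replace these fixed tables with fitted parameters.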
Abstract:
This literature review contributes to the debate on the effects of lean production systems on workers, where there is no agreement on whether negative or positive effects dominate. The study reviews findings on psychological effects, health and safety, workplace characteristics, and worker satisfaction. The research reviewed cannot confirm that, from the workers' point of view, lean production is better than other production initiatives. Lean production both enhances and decreases worker satisfaction at the same time, so that overall satisfaction does not change significantly compared to other systems. Critics of lean production emphasize the negative impact of the other factors (psychological and so on) on workers, although the very limited number of empirical studies casts doubt on the soundness of these critical voices. At the same time, operations management researchers cannot substantively refute the increased level of stress, the increased risk of injuries and illness, or the intensification of work. The emphasis on negative effects and the absence of the expected positive effects suggest that the involvement-based lean production system is difficult to put into practice, and that an intensification-based model of lean production is also widespread.
Abstract:
The overall objective of the paper is to investigate whether, in the current practice of Hungarian companies, more developed logistics capabilities go together with higher performance. The empirical analysis uses the database of the fourth round of the corporate competitiveness survey of the Competitiveness Research Center, which operates alongside the Institute of Business Economics at the Corvinus University of Budapest; the database was built in 2009 from questionnaires completed by the managers of 300 companies. Using this database, the paper addresses two questions. First, can a group of companies with more developed logistics capabilities be identified in the sample? Second, do these more developed logistics capabilities go together with higher company performance? The analysis of the 2009 data confirmed the existence of two clusters of companies with significantly different levels of logistics capabilities. The results also show that the firms with more developed logistics capabilities achieved significantly higher operational performance and can be characterized by a stronger customer orientation.