903 results for "Mathematical problem with complementarity constraints"


Relevance: 100.00%

Abstract:

While there are many articles in the popular press and practitioner journals concerning the Millennials (i.e., who they are and what we need to do about them), the academic literature on the subject is more limited. This chapter (1) extensively reviews this literature as published in practitioner, popular press, and academic journals across disciplines including psychology, sociology, management, human resources, and accounting education, and (2) surveys the generational study literature to determine what, if any, rigorous empirical studies exist to support (or refute) the existence of a distinct Millennial generational cohort. While the popular press is voluminous when it comes to avowed generational differences between Millennials and their predecessors, there is a paucity of peer-reviewed, academic, empirical work in the area, and most of the latter suffers in some way from the overarching problem with generational research: the linear relationship between age, period, and generation that results in these variables being inherently entwined. However, even absent strong empirical evidence of a unique generational cohort, the literature offers extensive suggestions about what to do about the Millennials in our classrooms and workplaces. This chapter better informs accounting faculty about the traits of the current generation of accounting students that are supported by empirical research versus claims made in the popular press. It argues for a more reasoned "continuous improvement" approach to Millennials while offering some classroom suggestions for accounting faculty members.

Relevance: 100.00%

Abstract:

Theoretical studies of the problems of the securities markets in the Russian Federation incline to one or the other of two traditional approaches. The first consists of comparing the definition of "valuable paper" set forth in the current legislation of the Russian Federation with the theoretical model of "Wertpapiere" elaborated by German scholars more than 90 years ago. The problem with this approach is, in Mr. Pentsov's opinion, that any new features of the definition of "security" that do not coincide with the theoretical model of "Wertpapiere" (such as valuable papers existing in non-material, electronic form) are declared incorrect and removed from the current legislation of the Russian Federation. The second approach works on the basis of the differentiation between the Common Law concept of "security" and the Civil Law concept of "valuable paper". Mr. Pentsov's research, presented in an article written in English, uses both methodological tools and involves, firstly, a historical study of the origin and development of certain legal phenomena (securities) as they evolved in different countries, and secondly, a comparative, synchronic study of equivalent legal phenomena as they exist in different countries today. Employing the first method, Mr. Pentsov divided the historical development of the conception of "valuable paper" in Russia into five major stages. He found that, despite the existence of a relatively wide circulation of valuable papers, especially in the second half of the 19th century, Russian legislation before 1917 (the first stage) did not have a unified definition of valuable paper. The term was used in both theoretical studies and legislation, but it covered a broad range of financial instruments such as stocks, bonds, government bonds, promissory notes, bills of exchange, etc. During the second stage, likewise, the legislation of the USSR did not have a unified definition of "valuable paper". After the end of the "new economic policy" (1922-1930), the stock exchanges and the securities markets in the USSR were, with very few exceptions, abolished. Thus, during the third stage (up to 1985), the use of valuable papers in practice was reduced to foreign economic relations (bills of exchange, stocks in enterprises outside the USSR) and to state bonds. Not surprisingly, there was still no unified definition of "valuable paper". After the beginning of Gorbachev's perestroika, a securities market began to re-appear in the USSR. However, the successful development of securities markets in the USSR was retarded by the absence of an appropriate regulatory framework. The first effort to improve the situation was the adoption of the Regulations on Valuable Papers, approved by resolution No. 590 of the Council of Ministers of the USSR, dated June 19, 1990. Section 1 of the Regulations contained the first statutory definition of "valuable paper" in the history of Russia. At the very beginning of the period of transition to a market economy, a number of acts contained different definitions of "valuable paper". This diversity clearly undermined the stability of the Russian securities market and did not achieve the goal of protecting the investor. The lack of unified criteria for treating such non-standard financial instruments as "valuable papers" significantly contributed to the appearance of numerous fraudulent "pyramid" schemes that were outside the regulatory scheme of Russian legislation.
The situation was substantially improved by the adoption of the new Civil Code of the Russian Federation. According to Section 1 of Article 142 of the Civil Code, a valuable paper is a document that confirms, in compliance with an established form and mandatory requisites, certain material rights whose realisation or transfer is possible only upon its presentation. Finally, the recent Federal Law No. 39-FZ "On the Valuable Papers Market", dated April 22, 1996, has also introduced the term "emission valuable papers". According to Article 2 of this Law, an "emission valuable paper" is any valuable paper, including non-documentary ones, that simultaneously has the following features: it fixes the composition of material and non-material rights that are subject to confirmation, cession and unconditional realisation in compliance with the form and procedure established by this federal law; it is placed by issues; and it has an equal amount and time of realisation of rights within the same issue, regardless of when the valuable paper was purchased. Thus the introduction of the conception of "emission valuable paper" became the starting point in the Russian Federation's legislation for differentiating between the legal regimes of "commercial papers" and "investment papers", similar to the Common Law approach. Moving now to the synchronic, comparative method of research, Mr. Pentsov notes that there are currently three major conceptions of "security" and, correspondingly, three approaches to its legal definition: the Common Law concept, the continental law concept, and the concept employed by Japanese law. Mr. Pentsov proceeds to analyse the differences and similarities of all three, concluding that though the concept of "security" in the Common Law system substantially differs from that of "valuable paper" in the Continental Law system, the two concepts are nevertheless developing in similar directions. He predicts that in the foreseeable future the existing differences between these two concepts will become less and less significant. On the basis of his research, Mr. Pentsov arrived at the conclusion that the concept of "security" (and its equivalents) is not a static one. On the contrary, it is in a process of permanent evolution that reflects the introduction of new financial instruments onto the capital markets. He believes that the scope of the statutory definition of "security" plays an extremely important role in the protection of investors. When passing the Securities Act of 1933, the United States Congress determined that the best way to achieve the goal of protecting investors was to define the term "security" in sufficiently broad and general terms so as to include within the definition the many types of instruments that in the commercial world fall within the ordinary concept of "security", and to cover the countless and various devices used by those who seek to use the money of others on the promise of profits. On the other hand, the very limited scope of the current definition of "emission valuable paper" in the Federal Law of the Russian Federation entitled "On the Valuable Papers Market" does not allow the anti-fraud provisions of this law to be implemented in an efficient way. Consequently, there is no basis for the protection of investors. Mr. Pentsov proposes amendments which he believes would enable the Russian markets to become more efficient and attractive for both foreign and domestic investors.

Relevance: 100.00%

Abstract:

A colony of golden hamsters had an ongoing problem with hydrocephalus. In an attempt to clear the colony of the problem, new breeders from another supplier had been purchased. At termination of a behavioral study, brains were collected from 35 animals (four of which had died with hydrocephalus during the study) and examined macroscopically and by light microscopy. Although no animals manifested obvious behavioral changes, 31 of 35 (88.6%; 13/15 males and 18/20 females in control and manipulated groups) had hydrocephalus. Twenty-five animals had macroscopically identifiable hydrocephalus, and six had hydrocephalus identified microscopically. Neither teratogenic concentrations of metals nor mycotoxins were detected in tissues or food, and sera from breeders tested negative for antibodies to Sendai virus, reovirus 3, and lymphocytic choriomeningitis virus. Trial matings of breeders expected to produce hydrocephalic offspring resulted in affected offspring, and matings of breeders expected to produce normal offspring resulted in normal or less-affected offspring. Hydrocephalus was confirmed retrospectively in some breeders. Hereditary hydrocephalus appears to be widespread in hamster stocks in Central Europe. Affected animals do not manifest signs of disease and usually die without obvious premonitory signs. Despite severe hydrocephalus, the animals can breed, and animal handlers do not identify motor deficits or abnormal behavioral activity. This entity is unlike the previously described hereditary hydrocephalus of hamsters, which is phenotypically identifiable and usually lethal before affected animals attain breeding age.

Relevance: 100.00%

Abstract:

STUDY DESIGN: Systematic literature review. OBJECTIVE: To evaluate the safety and efficacy of vertebroplasty and kyphoplasty using the data presented in published clinical studies, with respect to patient pain relief, restoration of mobility and vertebral body height, complication rate, and incidence of new adjacent vertebral fractures. SUMMARY OF BACKGROUND DATA: Vertebroplasty and kyphoplasty have been gaining popularity for treating vertebral fractures. Current reviews provide an overview of the procedures but are not comprehensive and tend to rely heavily on personal experience. This article aimed to compile all available data and evaluate the clinical outcome of the 2 procedures. METHODS: This is a systematic review of all the available data presented in peer-reviewed published clinical trials. The methodological quality of included studies was evaluated, and data were collected targeting specific standard measurements. Where possible, a quantitative aggregation of the data was performed. RESULTS: A large proportion of subjects had some pain relief: 87% with vertebroplasty and 92% with kyphoplasty. Vertebral height restoration was possible using kyphoplasty (average 6.6 degrees) and for a subset of patients using vertebroplasty (average 6.6 degrees). Cement leaks occurred for 41% and 9% of treated vertebrae for vertebroplasty and kyphoplasty, respectively. New fractures of adjacent vertebrae occurred for both procedures at rates higher than in the general osteoporotic population but approximately equivalent to the general osteoporotic population with a previous vertebral fracture. CONCLUSIONS: The problem with stating definitively that vertebroplasty and kyphoplasty are safe and effective procedures is the lack of comparative, blinded, randomized clinical trials. Standardized evaluative methods should be adopted.

Relevance: 100.00%

Abstract:

Recurrent event data are largely characterized by the rate function, but smoothing techniques for estimating the rate function have never been rigorously developed or studied in the statistical literature. This paper considers the moment and least squares methods for estimating the rate function from recurrent event data. With an independent censoring assumption on the recurrent event process, we study statistical properties of the proposed estimators and propose bootstrap procedures for bandwidth selection and for approximating confidence intervals in the estimation of the occurrence rate function. We show that, without resmoothing via a smaller bandwidth, the moment method produces a curve with nicks at the censoring times, whereas the least squares method has no such problem. Furthermore, the asymptotic variance of the least squares estimator is shown to be smaller under regularity conditions. However, in the implementation of the bootstrap procedures, the moment method is computationally more efficient than the least squares method because the former approach uses condensed bootstrap data. The performance of the proposed procedures is studied through Monte Carlo simulations and an epidemiological example on intravenous drug users.
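
As a concrete illustration of the moment-type estimator, a minimal sketch follows (ours, not the authors' code), assuming an Epanechnikov kernel, per-subject censoring times, and invented names throughout:

    import numpy as np

    # Moment-type kernel estimate of the occurrence rate function: the
    # kernel-smoothed count of all observed events at t, divided by the
    # number of subjects still under observation (censoring time >= t).
    def kernel_rate(t_grid, event_times, censor_times, h):
        censor_times = np.asarray(censor_times, dtype=float)
        all_events = np.concatenate([np.asarray(e, dtype=float) for e in event_times])
        rates = np.full(len(t_grid), np.nan)
        for k, t in enumerate(t_grid):
            at_risk = np.sum(censor_times >= t)              # subjects with C_i >= t
            if at_risk == 0:
                continue
            u = (t - all_events) / h
            kern = 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)  # Epanechnikov kernel
            rates[k] = kern.sum() / (h * at_risk)
        return rates

    # Example: three subjects, bandwidth 1.0.
    # kernel_rate(np.linspace(0, 10, 50), [[1.2, 3.4], [2.0], [0.5, 4.1, 7.3]],
    #             [8.0, 9.5, 10.0], 1.0)

The nicks mentioned in the abstract arise because the at-risk denominator drops abruptly at each censoring time; resmoothing with a smaller bandwidth, or the least squares variant, smooths them out.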

Relevance: 100.00%

Abstract:

Memory impairments constitute an increasing objective and subjective problem with advancing age. The aim of the present study was to investigate the impact of working memory training on memory performance. The authors trained a sample of 80-year-old adults twice weekly over a time period of 3 months. Participants were tested on 4 different memory measures before, immediately after, and 1 year after training completion. The authors found overall increased memory performance in the experimental group compared to an active control group immediately after training completion. This increase was especially pronounced in visual working memory performance and, to a smaller degree, also in visual episodic memory. No group differences were found 1 year after training completion. The results indicate that even in old-old adults, brain plasticity is strong enough to result in transfer effects, that is, performance increases in tasks that were not trained during the intervention.

Relevance: 100.00%

Abstract:

An extrusion die is used to continuously produce parts with a constant cross section, such as sheets, pipes, tire components, and more complex shapes such as window seals. The die is fed by a screw extruder when polymers are used. The extruder melts, mixes, and pressurizes the material by the rotation of either a single or a double screw. The polymer can then be continuously forced through the die, producing a long part in the shape of the die outlet. The extruded section is then cut to the desired length. Generally, the primary target of a well designed die is to produce a uniform outlet velocity without excessively raising the pressure required to extrude the polymer through the die. Other properties such as temperature uniformity and residence time are also important but are not directly considered in this work. Designing dies for optimal outlet velocity variation using simple analytical equations is feasible for basic die geometries or simple channels. Due to the complexity of die geometry and of polymer material properties, the design of complex dies by analytical methods is difficult; iterative methods must be used, and an automated iterative method is desired for die optimization. To automate the design and optimization of an extrusion die, two issues must be dealt with. The first is how to generate a new mesh for each iteration. In this work, this is approached by modifying a Parasolid file that describes a CAD part; this file is then used in a commercial meshing software. Skewing the initial mesh to produce a new geometry was also employed as a second option. The second issue is an optimization problem with the presence of noise stemming from variations in the mesh and cumulative truncation errors. In this work a simplex method and a modified trust region method were employed for automated optimization of die geometries. For the trust region, a discrete derivative and a BFGS Hessian approximation were used. To deal with the noise in the function, the trust region method was modified to automatically adjust the discrete-derivative step size and the trust region based on changes in noise and function contour. Generally, uniformity of velocity at the exit of the extrusion die can be improved by increasing resistance across the die, but this is limited by the pressure capabilities of the extruder. In optimization, a penalty factor that increases exponentially from the pressure limit is applied. This penalty can be applied in two different ways: the first only to designs that exceed the pressure limit, the second to designs both above and below the pressure limit. Both of these methods were tested and compared in this work.
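
As a rough sketch of the exponential pressure penalty described above (the abstract does not give its exact form; the growth constant k and all names below are illustrative assumptions):

    import math

    # Objective for automated die design: outlet-velocity nonuniformity plus
    # an exponential penalty on die pressure. With one_sided=True, only
    # designs exceeding the extruder's pressure limit are penalized; with
    # one_sided=False the exponential term also acts below the limit,
    # ramping up smoothly as the limit is approached.
    def penalized_objective(velocity_dev, pressure, p_limit, k=5.0, one_sided=True):
        excess = pressure / p_limit - 1.0    # fractional overshoot of the limit
        if one_sided and excess <= 0.0:
            return velocity_dev
        return velocity_dev + math.exp(k * excess)

Either variant keeps the optimizer from exploiting the fact that more flow resistance improves velocity uniformity: past the pressure limit the penalty quickly dominates the objective.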

Relevance: 100.00%

Abstract:

Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be represented by a continuous analytic function that respects the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or higher-order polynomial interpolations, which do not fulfill all the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%-20%, while the parameter of the new algorithm can be adjusted in order to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using higher-order polynomial interpolation functions may result in significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
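
The authors' single-parameter Hermitian scheme is not given in the abstract; as a loose sketch of the same requirements (integral conservation plus guaranteed non-negativity for positive data), one can interpolate the cumulative integral with a monotone Hermite (PCHIP) spline and difference it over the new grid. All names below are ours:

    import numpy as np
    from scipy.interpolate import PchipInterpolator

    # Integral-conserving resampling of binned data: interpolate the
    # cumulative integral with a monotone Hermite spline (PCHIP), then
    # difference it over the new bin edges. Bin integrals are preserved
    # wherever old and new edges coincide, and for positive data the
    # cumulative curve is monotone, so resampled values cannot go negative.
    def conservative_resample(edges_in, values_in, edges_out):
        widths = np.diff(edges_in)
        cum = np.concatenate(([0.0], np.cumsum(values_in * widths)))
        spline = PchipInterpolator(edges_in, cum)
        return np.diff(spline(edges_out)) / np.diff(edges_out)

A plain linear or higher-order polynomial fit of the values themselves satisfies neither property, which is the source of the 10%-20% errors the abstract reports.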

Relevance: 100.00%

Abstract:

In an ideal world, all instructors of safety and health courses would be masters of course subject matter as well as the theories and practices for effective teaching. In practice, however, most instructors are much stronger in one or the other. This paper provides an example of how some fundamental knowledge from educational experts can be useful for improving a traditional safety course. Is there a problem with the way traditional safety and health (S&H) courses are taught? It is asserted by this author that S&H education, in general, places too much emphasis on acquisition and comprehension of facts at the expense of helping students develop higher-level cognitive abilities. This paper explains the basis for the assertion and reports an experience upgrading a traditional fire protection course to include more assignments involving the higher-level ability known in the education community as synthesis.

Relevance: 100.00%

Abstract:

INTRODUCTION The omega-3 and omega-6 polyunsaturated fatty acids (PUFAs) are the immediate precursors to a number of important mediators of immunity, inflammation and bone function, with products of omega-6 generally thought to promote inflammation and favour bone resorption. Western diets generally provide a 10- to 20-fold deficit in omega-3 PUFAs compared with omega-6, and this is thought to have contributed to the marked rise in incidence of disorders of modern human societies, such as heart disease, colitis and perhaps osteoporosis. Many of our food production animals, fed on grains rich in omega-6, are also exposed to a dietary deficit in omega-3, with perhaps similar health consequences. Bone fragility due to osteoporotic changes in laying hens is a major economic and welfare problem, with our recent estimates of breakage rates indicating up to 95% of free-range hens suffer breaks during lay. METHODS Free-range hens housed in full-scale commercial systems were provided diets supplemented with omega-3 alpha-linolenic acid, and the skeletal benefits were investigated by comparison to standard diets rich in omega-6. RESULTS There was a significant 40-60% reduction in keel bone breakage rate, and a corresponding reduction in breakage severity in the omega-3 supplemented hens. There was significantly greater bone density and bone mineral content, alongside increases in total bone and trabecular volumes. The mechanical properties of the omega-3 supplemented hens were improved, with strength, energy to break and stiffness demonstrating significant increases. Alkaline phosphatase (an osteoblast marker) and tartrate-resistant acid phosphatase (an osteoclast marker) both showed significant increases with the omega-3 diets, indicating enhanced bone turnover. This was corroborated by the significantly lower levels of the mature collagen crosslinks hydroxylysyl pyridinoline, lysyl pyridinoline and histidinohydroxylysinonorleucine, with a corresponding significant shift in the mature:immature crosslink ratio. CONCLUSIONS The improved skeletal health in laying hens corresponds to as many as 68 million fewer hens suffering keel fractures in the EU each year. The biomechanical and biochemical evidence suggests that increased bone turnover has enhanced the bone mechanical properties, and this may suggest potential benefits for human osteoporosis.

Relevance: 100.00%

Abstract:

Linguistic palaeontology permits the identification of two language families whose linguistic ancestors pose the likeliest candidates for the original domesticators of rice, viz. Hmong-Mien and Austroasiatic. In the 2009 model, the ancient Hmong-Mien were identified as the primary domesticators of Asian rice, and the ancient Austroasiatics as the secondary domesticators. Recent rice genetic research leads to a modification of this model, but falls short of identifying the original locus of rice domestication. At the same time, the precise whereabouts of the Austroasiatic homeland remains disputed. Linguistic evidence unrelated to rice agriculture has been adduced to support a southern homeland for Austroasiatic somewhere within the Bay of Bengal littoral. The implications of the new rice genetic research are discussed, the linguistic palaeontological evidence is reassessed, and an enduring problem with the archaeology of rice agriculture is highlighted.

Relevance: 100.00%

Abstract:

Car manufacturers increasingly offer delivery programs for the factory pick-up of new cars. Such a program consists of a broad range of event-marketing activities. In this paper we investigate the problem of scheduling the delivery program activities of one day such that the sum of the customers' waiting times is minimized. We show how to model this problem as a resource-constrained project scheduling problem with a nonregular objective function, and we present a relaxation-based beam-search solution heuristic. The relaxations are solved by exploiting a duality relationship between temporal scheduling and min-cost network flow problems. This approach has been developed in cooperation with a German automaker. The performance of the heuristic has been evaluated based on practical and randomly generated test instances.
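
The abstract does not spell out the heuristic; purely as an illustration of relaxation-guided beam search, a generic skeleton follows, in which candidate partial schedules are ranked by a lower bound (in the paper's setting, obtained from the min-cost network flow relaxation). The callback interface is our assumption:

    from heapq import nsmallest

    # Generic relaxation-guided beam search. `expand(state)` yields children
    # with one more activity scheduled, `bound(state)` returns a lower bound
    # on the objective (for a complete schedule, its actual cost), and
    # `is_complete` flags fully scheduled states. Only the beam_width most
    # promising partial states survive each level.
    def beam_search(initial, expand, bound, is_complete, beam_width=10):
        beam, best = [initial], None
        while beam:
            children = [c for s in beam for c in expand(s)]
            for c in children:
                if is_complete(c) and (best is None or bound(c) < bound(best)):
                    best = c
            beam = nsmallest(beam_width,
                             (c for c in children if not is_complete(c)),
                             key=bound)
        return best

The quality of the relaxation bound determines both the pruning power and the final schedule quality, which is why a fast exact relaxation solver (here, min-cost flow) matters.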

Relevance: 100.00%

Abstract:

Cichlid fish inhabit a diverse range of environments that vary in the spectral content of light available for vision. These differences should result in adaptive selective pressure on the genes involved in visual sensitivity, the opsin genes. This study examines the evidence for differential adaptive molecular evolution in East African cichlid opsin genes due to gross differences in environmental light conditions. First, we characterize the selective regime experienced by cichlid opsin genes using a likelihood ratio test format, comparing likelihood models with different constraints on the relative rates of amino acid substitution across sites. Second, we compare turbid and clear lineages to determine if there is evidence of differences in relative rates of substitution. Third, we present evidence of functional diversification and its relationship to the photic environment among cichlid opsin genes. We report statistical evidence of positive selection in all cichlid opsin genes except short wavelength-sensitive 1 and short wavelength-sensitive 2b. In all genes predicted to be under positive selection, except short wavelength-sensitive 2a, we find differences in selective pressure between turbid and clear lineages. Potential spectral tuning sites are variable among all cichlid opsin genes; however, patterns of substitution consistent with photic environment-driven evolution of opsin genes are observed only for short wavelength-sensitive 1 opsin genes. This study identifies a number of promising candidate tuning sites for future study by site-directed mutagenesis. This work also begins to demonstrate the molecular evolutionary dynamics of cichlid visual sensitivity and its relationship to the photic environment.
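
For readers unfamiliar with the test format: positive selection is typically inferred by comparing nested codon models, fitted elsewhere (e.g., with PAML's codeml), via a likelihood ratio test. A minimal sketch of that comparison (the function and its inputs are ours):

    from scipy.stats import chi2

    # Likelihood ratio test between nested codon substitution models, e.g. a
    # null model with no positively selected site class vs. an alternative
    # allowing a class of sites with dN/dS > 1. `df` is the difference in the
    # number of free parameters between the two models.
    def likelihood_ratio_test(lnL_null, lnL_alt, df=2):
        stat = 2.0 * (lnL_alt - lnL_null)
        return stat, chi2.sf(stat, df)    # test statistic and p-value

A significant result supports the less constrained model, i.e. evidence that some sites evolve under positive selection.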

Relevance: 100.00%

Abstract:

Sick Building Syndrome is a prevalent problem with patient complaints similar to typical allergy symptoms. Unlike most household allergens, the Asp f 1 allergen is conceivably ubiquitous in the work environment. This project examined levels of the Asp f 1 allergen in office and non-industrial occupational environments, and studied the bioaerosol and dust reservoirs of Aspergillus fumigatus responsible for those levels. Culturable bioaerosols of total mesophilic fungi were sampled with Andersen N6 impactors. Aggressive airborne and bulk dust samples were concurrently collected and assayed for Asp f 1. Bulk dusts were selectively cultured for A. fumigatus. Samples were collected during both wet and dry climatological conditions to examine the possibility of Asp f 1 increases due to fungal growth blooms. Only very low levels of Asp f 1 were detected in relatively few samples. Analysis of wet versus dry period samples showed no differences in Asp f 1 levels, although A. fumigatus counts from dusts did fluctuate significantly with exterior moisture events, as did indoor prevalence of total colony forming units. These results indicate that even in the presence of elevated fungal concentrations, levels of Asp f 1 are extremely low. These levels do not correlate with climatological moisture events, despite distinct fungal blooms in the days immediately following those events. Non-industrial office buildings devoid of indoor air quality issues did not demonstrate significant levels or occurrence of Asp f 1 contamination in the geographical region of this study.

Relevance: 100.00%

Abstract:

Though IP multicast is resource efficient in delivering data to a group of members simultaneously, it suffers from a scalability problem with the number of concurrently active multicast groups because it requires a router to keep forwarding state for every multicast tree passing through it. To solve this state scalability problem, we proposed a scheme called aggregated multicast. The key idea is that multiple groups are forced to share a single delivery tree. In our earlier work, we introduced the basic concept of aggregated multicast and presented some initial results to show that multicast state can be reduced. In this paper, we develop a more quantitative assessment of the cost/benefit trade-offs. We propose an algorithm to assign multicast groups to delivery trees with controllable cost and introduce metrics to measure multicast state and tree management overhead for multicast schemes. We then compare aggregated multicast with conventional multicast schemes, such as the source-specific tree scheme and the shared tree scheme. Our extensive simulations show that aggregated multicast can achieve significant routing state and tree management overhead reduction while containing the expense of extra resources (bandwidth waste and tunnelling overhead). We conclude that aggregated multicast is a very cost-effective and promising direction for scalable transit-domain multicast provisioning.
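
As a toy illustration of the group-to-tree matching idea (the paper's algorithm works on router-level topologies and measures bandwidth waste; the set-based model, threshold, and names below are our simplifications):

    # Assign a multicast group to an aggregated delivery tree. A group may
    # ride an existing shared tree that covers extra nodes, as long as the
    # wasted coverage stays under `overhead_limit` (the controllable cost);
    # otherwise a new tree is created for it.
    def assign_group(group, trees, overhead_limit=0.3):
        best, best_overhead = None, overhead_limit
        for tree in trees:
            if group <= tree:                             # tree reaches all members
                overhead = len(tree - group) / len(tree)  # fraction of wasted coverage
                if overhead < best_overhead:
                    best, best_overhead = tree, overhead
        if best is None:
            best = set(group)                             # no acceptable match: new tree
            trees.append(best)
        return best

    # Example: with trees = [{1, 2, 3, 4}], assign_group({1, 2, 3}, trees)
    # reuses the existing tree (overhead 0.25 < 0.3).

Raising the overhead threshold trades bandwidth waste for fewer trees, i.e. less forwarding state per router, which is exactly the cost/benefit trade-off the paper quantifies.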