954 results for Medicine counterfeiting Organized crime Product protection Analysis Forensic intelligence
Abstract:
OBJECTIVES Facial self-mutilation is rare. It is usually discussed from psychiatric or psychoanalytic perspectives but has little prominence in the general medical literature. Our objective was to describe facial self-mutilation in terms of its comorbidities, to outline the different types of facial mutilation, and to summarise the basic approach to patients with facial self-mutilation. METHODS We undertook a review of all published cases of facial self-mutilation (1960-2011). RESULTS We identified 200 published cases in 123 relevant papers. Four major groups emerged: psychiatric, neurological and hereditary disorders, and a group of patients without identified comorbidities. There were three general patterns of facial self-mutilation: (1) major and definitive mutilation, with the ocular globe as the primary target, seen in patients with psychotic disorders; (2) stereotypical mutilation involving the oral cavity, of variable severity, most often seen in patients with hereditary neuropathy or encephalopathy; (3) mild chronic self-mutilation, seen in patients with non-psychotic psychiatric disorders, acquired neurological disorders, and patients without comorbidities. About 20% of patients who mutilated their face also mutilated extra-facial structures. Patients with psychiatric conditions, especially those with psychotic disorders, had significantly higher (p<0.05) rates of permanent facial self-mutilation than others. Most treatment plans were highly individualised, but some principles, such as prevention of irreversible loss of function and structure and prevention of infection, are applicable to all patients with facial self-mutilation. CONCLUSIONS Facial self-mutilation is a potentially severe manifestation of diverse conditions. Several aspects of facial self-mutilation remain to be fully characterised from a clinical perspective.
Abstract:
Vertebral compression fracture is a common medical problem in osteoporotic individuals. The quantitative computed tomography (QCT)-based finite element (FE) method may be used to predict vertebral strength in vivo, but needs to be validated with experimental tests. The aim of this study was to validate a nonlinear, anatomy-specific QCT-based FE model using a novel testing setup. Thirty-seven human thoracolumbar vertebral bone slices were prepared by removing cortical endplates and posterior elements. The slices were scanned with QCT and the volumetric bone mineral density (vBMD) was computed with the standard clinical approach. A novel experimental setup was designed to induce a realistic failure in the vertebral slices in vitro. Rotation of the loading plate was allowed by means of a ball joint. To minimize device compliance, the specimen deformation was measured directly on the loading plate with three sensors. A nonlinear FE model was generated from the calibrated QCT images, and the computed vertebral stiffness and strength were compared to those measured during the experiments. In agreement with clinical observations, most of the vertebrae underwent an anterior wedge-shaped fracture. As expected, the FE method predicted both stiffness and strength better than vBMD (R2 improved from 0.27 to 0.49 and from 0.34 to 0.79, respectively). Despite the lack of fitting parameters, the linear regression of the FE prediction for strength was close to the 1:1 relation (slope close to one, at 0.86; intercept close to zero, at 0.72 kN). In conclusion, a nonlinear FE model was successfully validated through a novel experimental technique for generating wedge-shaped fractures in human thoracolumbar vertebrae.
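The R2 comparison reported above (strength: 0.34 for vBMD vs 0.79 for the FE model) amounts to fitting a least-squares line for each predictor against measured strength and comparing coefficients of determination. A minimal sketch of that comparison, using hypothetical specimen data in place of the study's measurements:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination of the least-squares line y ~ a*x + b."""
    a, b = np.polyfit(x, y, 1)           # fit slope and intercept
    resid = y - (a * x + b)              # residuals of the fit
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical measured strengths (kN) for a few specimens, plus two predictors:
measured = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.6])
vbmd     = np.array([110, 150, 100, 160, 170, 140])  # vBMD (mg/cm^3), noisier predictor
fe_pred  = np.array([2.0, 3.1, 1.9, 3.8, 3.0, 3.3])  # FE-predicted strength (kN)

print(f"R^2 (vBMD):        {r_squared(vbmd, measured):.2f}")
print(f"R^2 (FE strength): {r_squared(fe_pred, measured):.2f}")
```

With data of this shape, the FE-style predictor explains a larger share of the variance in measured strength, mirroring the direction of the study's result.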
Abstract:
Ischaemic spinal cord injury (SCI) remains the Achilles heel of open and endovascular descending thoracic and thoracoabdominal repair. Neurological outcomes have improved coincidentally with the introduction of neuroprotective measures. However, SCI (paraplegia and paraparesis) remains the most devastating complication. The aim of this position paper is to provide physicians with broad information regarding spinal cord blood supply, to share strategies for shortening intraprocedural spinal cord ischaemia, and to increase spinal cord tolerance to transitory ischaemia through detection of ischaemia and augmentation of spinal cord blood perfusion. The paper is meant to support, with decision-making algorithms, physicians caring for patients in need of any kind of thoracic or thoracoabdominal aortic repair, so that ischaemic SCI can be understood, prevented or reversed. Information has been extracted from focused publications available in the PubMed database: cohort studies, experimental research reports, case reports, reviews, short series and meta-analyses. Individual chapters of this position paper were assigned and, after delivery, harmonized by Christian D. Etz, Ernst Weigang and Martin Czerny. Further writing assignments were then distributed within the group and delivered in August 2014. The final version was submitted to the EJCTS for review in September 2014.
Abstract:
The genetic variability of milk protein genes may influence the nutritive value or the processing and functional properties of milk. While numerous protein variants are known in ruminants, knowledge about milk protein variability in horses is still limited. Mare's milk is, however, produced for human consumption in many countries. Beta-lactoglobulin, a major whey protein, belongs to the lipocalins, a protein family known as common food- and airborne allergens. It is absent from human milk and is thus a key agent in provoking cow's milk protein allergy. Mare's milk is, however, usually better tolerated by most affected people. Several functions of β-lactoglobulin have been discussed, but its ultimate physiological role remains unclear. In the current study, the open reading frames of the two equine β-lactoglobulin paralogues LGB1 and LGB2 were re-sequenced in 249 horses belonging to 14 different breeds in order to predict the existence of protein variants at the DNA level. Only a single signal peptide variant of LGB1, but 10 different putative protein variants of LGB2, were identified. Both genes are expressed in horses, and as such this is a striking, previously unknown difference in genetic variability between the two genes. It can be assumed that LGB1 is the ancestral paralogue, with an essential function causing high selection pressure. As horses have a very low milk fat content, this unknown function might well be related to vitamin uptake. Further studies are, however, needed to elucidate the properties of the different gene products.
Abstract:
Deregulation of kinase activity is one example of how cells become cancerous by evading evolutionary constraints. The Tousled kinase (Tsl) was initially identified in Arabidopsis thaliana as a developmentally important kinase. There are two mammalian orthologues of Tsl and one orthologue in C. elegans, TLK-1, which is essential for embryonic viability and germ cell development. Depletion of TLK-1 leads to embryonic arrest with large, distended nuclei, and ultimately to embryonic lethality. Prior to terminal arrest, TLK-1-depleted embryos undergo aberrant mitoses characterized by poor metaphase chromosome alignment, delayed mitotic progression, lagging chromosomes, and supernumerary centrosomes. I discovered an unanticipated requirement for TLK-1 in mitotic spindle assembly and positioning. Normally, in the newly fertilized zygote (P0), the maternal pronucleus migrates toward the paternal pronucleus at the posterior end of the embryo. After pronuclear meeting, the pronuclear-centrosome complex rotates 90° during centration to align on the anteroposterior axis, followed by nuclear envelope breakdown (NEBD). However, in TLK-1-depleted P0 embryos, rotation of the centrosome-pronuclear complex is significantly delayed with respect to NEBD and chromosome congression. Additionally, tracking centrosome positions over time in tlk-1(RNAi) early embryos revealed a defect in posterior centrosome positioning during spindle-pronuclear centration, and 4D analysis of centrosome positions and movement in newly fertilized embryos showed aberrant centrosome dynamics in TLK-1-depleted embryos. Several mechanisms contribute to spindle rotation, one of which is the anchoring of astral microtubules to the cell cortex. Attachment of these microtubules to the cortex is thought to confer the stability and forces necessary to rotate the centrosome-pronuclear complex in a timely fashion.
Analysis of a microtubule end-binding protein revealed that TLK-1-depleted embryos exhibit a more stochastic distribution of microtubule growth toward the cell cortices, and that the types of microtubule attachments appear to differ from those of wild-type embryos. Additionally, fewer astral microtubules are in the vicinity of the cell cortex, suggesting that the delayed spindle rotation could be due in part to a lack of appropriate microtubule attachments to the cell cortex. Together with recently published biochemical data revealing that Tousled-like kinases associate with components of the dynein microtubule motor complex in humans, these data suggest that Tousled-like kinases play an important role in mitotic spindle assembly and positioning.
Abstract:
The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. 
The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one-value-per-variable paradigm and is widely employed in a host of clinical models and tools; it is often represented by a number in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable and constitute the measured observations that are typically available to end users when they review time series data; they are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood of producing a representation of the time series data elements that is able to distinguish between two or more classes of outcomes. The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU", provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time-series-based models are infeasible due to the relatively large number of data elements and the complexity of the preprocessing that must occur before data can be presented to the model.
Each of the seventeen steps in that process is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies for each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit", presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the receiver operating characteristic curve increased from a baseline of 87% to 98% with the inclusion of the trend analysis.
In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy compared to the baseline multivariate model, but diminished classification accuracy compared to adding just the trend analysis features (i.e., without the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve performance beyond that achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
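To make the "trend analysis" notion concrete: one simple latent feature of the kind described is the least-squares slope of a vital sign over a trailing window. A minimal sketch with hypothetical readings and window length (not the papers' actual feature set):

```python
import numpy as np

def trend_feature(samples, window):
    """Least-squares slope over the trailing `window` samples.

    A sustained negative slope on, e.g., systolic blood pressure is the
    kind of derived 'deterioration' feature described above; the raw
    samples themselves belong to the multiple-values-per-variable class.
    """
    recent = np.asarray(samples[-window:], dtype=float)
    t = np.arange(len(recent))                 # sample index as the time axis
    slope, _intercept = np.polyfit(t, recent, 1)
    return slope

# Hypothetical systolic blood pressure readings (mmHg), one per minute:
stable    = [118, 120, 119, 121, 120, 119, 120, 121]
declining = [120, 118, 115, 111, 106, 100, 93, 85]

print(f"stable slope:    {trend_feature(stable, 8):+.2f} mmHg/min")
print(f"declining slope: {trend_feature(declining, 8):+.2f} mmHg/min")
```

A model fed only the latest raw reading would see similar values in both series early on; the slope feature separates them, which is the intuition behind the reported accuracy gains.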
Abstract:
A forensic report is the primary work product of a forensic psychologist. The aim of a forensic report is to inform and influence the court. Unlike a clinical report, a forensic report influences the outcome of a legal conflict. This means that greater care must be taken in writing the report. The following errors (Grisso, 2010) were used to discuss best practices in forensic report writing: failure to answer the referral question, organization problems, language problems, mixed data and interpretation, inclusion of irrelevant data, over-reliance on a single source of data, improper psychological test use, failure to consider alternative hypotheses, and opinions without sufficient explanation. The purpose of this paper is to provide in one place all the information needed to improve forensic report writing, and to help the reader apply the literature using specific examples. Redacted report samples were collected from psychologists, graduate psychology trainees, teaching assistant experience, and clinical work. Identified errors in these samples were then corrected using the recommendations in the literature. Geared toward graduate psychology trainees, each section should serve both as a tutorial and as a brief checklist to help the reader avoid common pitfalls and assist in promoting better forensic report writing.
Abstract:
Introduction. This chapter takes a closer look at the European Union (EU), China, and the Association of Southeast Asian Nations (ASEAN)’s respective approaches to dealing with non-traditional security (NTS) challenges by investigating their policies toward Burma/Myanmar—a source country of numerous such challenges. It argues that, although all, as members of the ASEAN Regional Forum (ARF), see the need for multilateral solutions to fight organized crime, provide disaster relief, combat terrorism, prevent drug trafficking, etc., they differ with respect to the steps to be taken to protect human security in Asia-Pacific. China, initially hesitant to join the ARF for fear that other members might try to contain it, has come to value the principal forum for NTS challenges in the Asia-Pacific region since, like many ASEAN countries, it is a big proponent of non-interventionism, non-use of force, consensus decision-making, that is, the confidence-building mechanisms commonly referred to as the ‘ASEAN way’. The EU, as a strong proponent of human rights and the rule of law, repeatedly, has criticized ARF members for allowing sovereignty-related norms to get in the way of the protection of human rights, but it has refrained from assuming the role of norm exporter. As will be seen in the case of Burma/Myanmar, the EU does make its opinions heard and, when necessary, will take unilateral steps not supported by the ASEAN members of the ARF but, cognizant of the history of the region, for the most part, settles for supporting economic development and aiding in capacity-building, understanding that it would be counter-productive to exert pressure on reluctant ARF members to modify the non-interference norm. The chapter then speculates about the ‘ASEAN way’s’ longevity, arguing that, increasingly, there are internal and external dynamics that seem to indicate that the ‘ASEAN way,’ at least in its current form, may not be here to stay.
The conclusion looks at what might be in store for Burma/Myanmar in the years to come.
Abstract:
The aim of analogue model experiments in geology is to simulate structures seen in nature, under specific imposed boundary conditions, using materials whose rheological properties are similar to those of natural rocks. In the late 1980s, X-ray computed tomography (CT) was first applied to the analysis of such models. In early studies only a limited number of cross-sectional slices could be recorded, because of the time involved in CT data acquisition, the long cooling periods for the X-ray source, and limited computational capacity. Technological improvements presently allow an almost unlimited number of closely spaced serial cross-sections to be acquired and calculated. Computer visualization software allows a full 3D analysis of every recorded stage. Such analyses are especially valuable when trying to understand complex geological structures, commonly with lateral changes in 3D geometry. Periodic acquisition of volumetric data sets in the course of the experiment makes it possible to carry out a 4D analysis of the model, i.e. 3D analysis through time. Examples are shown of 4D analyses of analogue models that tested the influence of lateral rheological changes on the structures obtained in contractional and extensional settings.
Abstract:
The last decade has witnessed a significant growth in transnational organised crime activities. It has also seen multiple efforts by the international community to come to terms with this rise of organised crime and to work towards an international instrument to combat the activities of criminal organisations. In December 2000, the United Nations opened for signature the Convention against Transnational Organized Crime (2001), also known as the Palermo Convention, a treaty that is supplemented by three protocols on trafficking in persons, smuggling of migrants, and trafficking in firearms and ammunition. The conclusion of the Convention marks the end of more than eight years of consultations on a universal instrument to criminalise and counteract transnational criminal organisations. This article illustrates the developments that led to the Convention against Transnational Organized Crime and reflects on the amendments and concessions that were made to earlier proposals during the elaboration process. This article highlights the strengths of the Convention in the areas of judicial cooperation and mutual legal assistance, and the shortcomings of the new Convention, in particular its failure to establish a universal, unequivocal definition of “transnational organized crime”.
Abstract:
Australian terrestrial elapid snakes contain amongst the most potently toxic venoms known. However, despite the well-documented clinical effects of snake bite, little research has focussed on individual venom components at the molecular level. To further characterise the components of Australian elapid venoms, a complementary DNA (cDNA) microarray was produced from the venom gland of the coastal taipan (Oxyuranus scutellatus) and subsequently screened for venom gland-specific transcripts. A number of putative toxin genes were identified, including neurotoxins, phospholipases, a pseudechetoxin-like gene, a venom natriuretic peptide and a nerve growth factor, together with other genes involved in cellular maintenance. Venom gland-specific components also included a calglandulin-like protein implicated in the secretion of toxins from the gland into the venom. These toxin transcripts were subsequently identified in seven other related snake species, producing a detailed comparative analysis at the cDNA and protein levels. This study represents the most detailed description to date of the cloning and characterisation of different genes associated with envenomation from Australian snakes.
Abstract:
Objective: Recently, much research has proposed using nature-inspired algorithms to perform complex machine learning tasks. Ant colony optimization (ACO) is one such algorithm: it is based on swarm intelligence and derived from a model inspired by the collective foraging behavior of ants. Taking advantage of ACO traits such as self-organization and robustness, this paper investigates ant-based algorithms for gene expression data clustering and associative classification. Methods and material: An ant-based clustering algorithm (Ant-C) and an ant-based association rule mining algorithm (Ant-ARM) are proposed for gene expression data analysis. The proposed algorithms make use of the natural behavior of ants, such as cooperation and adaptation, to allow for a flexible, robust search for good candidate solutions. Results: Ant-C has been tested on three datasets selected from the Stanford Genomic Resource Database and achieved relatively high accuracy compared to other classical clustering methods. Ant-ARM has been tested on the acute lymphoblastic leukemia (ALL)/acute myeloid leukemia (AML) dataset and generated about 30 classification rules with high accuracy. Conclusions: Ant-C can generate an optimal number of clusters without incorporating any other algorithm such as K-means or agglomerative hierarchical clustering. For associative classification, while a few well-known algorithms such as Apriori, FP-growth and Magnum Opus are unable to mine any association rules from the ALL/AML dataset within a reasonable period of time, Ant-ARM is able to extract associative classification rules.
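This sketch is not the paper's Ant-C or Ant-ARM algorithm, but the positive-feedback mechanism underlying ACO can be illustrated in a few lines: ants choose among options in proportion to pheromone, cheaper options receive stronger deposits, and evaporation forgets poor early choices:

```python
import random

def aco_select(costs, n_ants=20, n_iter=50, evaporation=0.5, seed=1):
    """Toy ant colony optimization over a fixed set of candidate options.

    Each 'ant' picks an option with probability proportional to its
    pheromone level; cheaper options receive larger pheromone deposits,
    so positive feedback drives the colony toward the lowest-cost option.
    """
    rng = random.Random(seed)
    pheromone = [1.0] * len(costs)
    for _ in range(n_iter):
        deposits = [0.0] * len(costs)
        for _ant in range(n_ants):
            # Roulette-wheel selection proportional to pheromone.
            r = rng.uniform(0.0, sum(pheromone))
            acc, choice = 0.0, len(costs) - 1
            for i, p in enumerate(pheromone):
                acc += p
                if r <= acc:
                    choice = i
                    break
            deposits[choice] += 1.0 / costs[choice]  # cheaper => stronger deposit
        # Evaporate old pheromone, then add this iteration's deposits.
        pheromone = [(1.0 - evaporation) * p + d for p, d in zip(pheromone, deposits)]
    return max(range(len(costs)), key=lambda i: pheromone[i])

# The colony converges on the index of the cheapest option:
print(aco_select([10.0, 1.0, 8.0, 6.0]))
```

In Ant-C the "options" would instead be cluster assignments scored by a clustering quality measure, and in Ant-ARM candidate association rules; this sketch only shows the pheromone-update loop that both share with generic ACO.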