77 results for burrow counting


Relevance: 10.00%

Abstract:

Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premises, the type of material examined, the context of discovery and the questions addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove useful to all researchers.
Contents:
1) Introduction (Samuel Verdan)
2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan)
3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner)
4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack)
5) 'Erfahrungsbericht' of application of different quantitative methods at Kalapodi (Sara Strack)
6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder)
7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce)
8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos)
9) Households and workshops at Early Iron Age Oropos: A quantitative approach of the fine, wheel-made pottery (Vicky Vlachou)
10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis)
11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros)
12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello)
13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas)
14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou)
15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi)
16) Pottery quantification: Some guidelines (Samuel Verdan)

Relevance: 10.00%

Abstract:

(241)Pu was determined in slurry samples from a nuclear reactor decommissioning project at the Paul Scherrer Institute (Switzerland). To validate the results, the (241)Pu activities of five samples were determined by LSC (TriCarb and Quantulus) and ICP-MS, with each instrument at a different laboratory. In the absence of certified reference materials for (241)Pu, the methods were further validated using the (241)Pu information values of two reference sediments (IAEA-300 and IAEA-384). Excellent agreement between the LSC and ICP-MS results was found for both the nuclear waste slurries and the reference sediments.
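
Agreement between two independent techniques is commonly quantified with a zeta-score, which compares the difference of the two results against their combined standard uncertainty. A minimal sketch with hypothetical activity values (not the study's data):

```python
import math

def zeta_score(a1, u1, a2, u2):
    """Compatibility of two results given their standard uncertainties."""
    return (a1 - a2) / math.sqrt(u1 ** 2 + u2 ** 2)

# Hypothetical (241)Pu activities for one slurry sample [Bq/g]:
lsc, u_lsc = 12.4, 0.5        # liquid scintillation counting
icpms, u_icpms = 12.1, 0.4    # mass spectrometry

zeta = zeta_score(lsc, u_lsc, icpms, u_icpms)
compatible = abs(zeta) < 2    # usual criterion: |zeta| < 2
```

With these numbers the two results agree comfortably within their combined uncertainty.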

Relevance: 10.00%

Abstract:

Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday life problems are of this kind. However, the number of options grows exponentially with the size of the problem, such that an exhaustive search for the best solution is practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach to obtain an approximate solution to the problem at hand is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance, therefore, depends on structural aspects of the search space, which in turn depend on the move operator being used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a relationship of neighborhood defined by the move operator. The landscape metaphor is used to explain the search dynamics as a sort of potential function. The concept is indeed similar to that of potential energy surfaces in physical chemistry. Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, energy minima are the local optima of the combinatorial problem, and we explore several definitions for the network edges. At first, we perform an exhaustive sampling of local optima basins of attraction, and define weighted transitions between basins by accounting for all the possible ways of crossing the basin frontier via one random move. 
Then, we reduce the computational burden by only counting the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectory of simple search heuristics, mining the frequency and inter-arrival time with which the heuristic visits local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape, and that we can characterize using the tools of complex network science. We argue that the network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured from the performance of trajectory-based local search heuristics.
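
The exhaustive basin-sampling step can be illustrated on a toy instance. The sketch below builds a local optima network for a small, randomly generated NK landscape: every solution is mapped to its basin's optimum by best-improvement hill climbing, and weighted escape edges count where single random bit-flips from each optimum lead. The instance, its size and the seed are illustrative choices, not the paper's experimental setup.

```python
import itertools
import random
from collections import defaultdict

random.seed(0)
N, K = 8, 2  # toy NK landscape: N bits, each locus interacts with K successors

# Random contribution tables: locus i's fitness depends on bit i and its K
# cyclic successors (illustrative instance, not the paper's benchmarks).
tables = [
    {bits: random.random() for bits in itertools.product((0, 1), repeat=K + 1)}
    for _ in range(N)
]

def fitness(s):
    return sum(
        tables[i][tuple(s[(i + j) % N] for j in range(K + 1))] for i in range(N)
    ) / N

def neighbours(s):
    """All solutions one bit-flip away (the move operator)."""
    return [s[:i] + (1 - s[i],) + s[i + 1:] for i in range(N)]

def hill_climb(s):
    """Best-improvement local search; returns the local optimum of s's basin."""
    while True:
        best = max(neighbours(s), key=fitness)
        if fitness(best) <= fitness(s):
            return s
        s = best

# Exhaustive sampling: map every solution to the optimum of its basin.
basin_of = {s: hill_climb(s) for s in itertools.product((0, 1), repeat=N)}
optima = set(basin_of.values())

# Escape edges: each single bit-flip from an optimum lands in some basin;
# edge weights count how many of the N flips lead to each destination basin.
edges = defaultdict(int)
for o in optima:
    for n in neighbours(o):
        edges[o, basin_of[n]] += 1
```

The resulting weighted directed graph (`optima` as nodes, `edges` as weights) is the kind of object the abstract proposes to characterize with network metrics.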

Relevance: 10.00%

Abstract:

Prospective comparative evaluation of patent V blue, fluorescein and (99m)Tc-nanocolloids for intraoperative sentinel lymph node (SLN) mapping during surgery for non-small cell lung cancer (NSCLC). Ten patients with peripherally localised clinical stage I NSCLC underwent thoracotomy and peritumoral subpleural injection of 2 ml of patent V blue dye, 1 ml of 10% fluorescein and 1 ml of (99m)Tc-nanocolloids (0.4 mCi). The migration and spatial distribution pattern of the tracers was assessed by direct visualisation (patent V blue), visualisation of the fluorescence signal under a Wood's lamp (fluorescein) and radioactivity counting with a hand-held gamma probe ((99m)Tc-nanocolloids). Lymph nodes at interlobar (ATS 11), hilar (ATS 10) and mediastinal (right ATS 2, 4, 7; left ATS 5, 6, 7) levels were systematically assessed every 10 min up to 60 min after injection, followed by lobectomy and formal lymph node dissection. Successful migration from the peritumoral area to the mediastinum was observed for all three tracers up to 60 min after injection. The interlobar lympho-fatty tissue (station ATS 11) revealed an early and preferential accumulation of all three tracers for all tumours assessed, irrespective of the tumour localisation. However, no preferential accumulation in one or two distinct lymph nodes was observed up to 60 min after injection for any of the three tracers. Intraoperative SLN mapping revealed successful migration of the tracers from the site of peritumoral injection to the mediastinum, but in a diffuse pattern without preferential accumulation in sentinel lymph nodes.

Relevance: 10.00%

Abstract:

Purified, [131I]-labeled goat antibodies against carcinoembryonic antigen, which have been shown to localize in human carcinoma in nude mice, were injected into 27 patients with carcinoma. Patients were scanned with a scintillation camera at various intervals. In 11 patients, radioactivity was detectable in the tumor 48 hours after injection. Computerized subtraction of blood-pool radioactivity provided clearer pictures in positive cases, but in 16 patients the scans remained doubtful or negative. To study the specificity of [131I]-antibody localization, we gave some patients simultaneous injections of [125I]-labeled normal IgG. Both isotopes were measured by means of scintillation counting in tumors and normal tissues recovered after surgery. The results demonstrated that only the anti-CEA antibodies localized in tumors. However, the total antibody-derived radioactivity in the tumor was only about 0.001 of the injected dose. We conclude that, despite the present demonstration of specificity, this method of tumor detection is not yet clinically useful.

Relevance: 10.00%

Abstract:

Textile fibres are mass-produced goods used in the manufacture of many everyday objects, so the transfer of fibres during a criminal act is extremely common. Because fibres are omnipresent in our environment, it is essential that the forensic scientist be able to evaluate the value of fibre evidence. Interpreting fibre evidence requires knowledge of a number of parameters, such as the rarity of the fibres, the probability of finding extraneous fibres by chance on a given surface, and the mechanisms of fibre transfer and persistence. The most important gaps concern transfer mechanisms: despite numerous studies, no model has yet been developed to predict the number of fibres expected to be recovered under given contact circumstances, as a function of the parameters characterizing the contact and the textiles involved. The main purpose of this research is to demonstrate that such a predictive model can be built. This work studies the particular case of fibre transfer from a driver's wool or acrylic knitted garment to the back of the car seat. Several characteristics of the textiles involved in the experiments were measured, and statistical tools (multiple linear regression) were then applied to the data to evaluate the influence of the donor textile's characteristics on the number of fibres transferred, and to build a model predicting that number from the characteristics that significantly influence the transfer. 
To facilitate the search for and counting of transferred fibres, an automatic fibre-search apparatus (fiber finder) was used. Tests of its efficiency show that automatic searching is, overall, as effective as visual searching for strongly coloured fibres, but loses effectiveness for very pale or very dark fibres. One donor-textile characteristic to be studied was fibre length. To measure this parameter, a sequence of image-processing algorithms was implemented; this tool measures the length of a fibre from its high-resolution (2'540 dpi) digital image. The tests performed show a measurement error on the order of a tenth of a millimetre, which is amply sufficient for this research. 
The statistical treatment of the transfer experiments led to a model of the transfer phenomenon with two parameters: the state of the donor fabric's surface and the length of the fibres composing it. The surface-state parameter accounts for the quantity of fibres that have detached from the fabric structure or are only weakly attached to it; these fibres are the first to transfer during a contact, and the larger their quantity per unit area, the more fibres are transferred. Fibre length is also important: the longer the fibres, the better they are retained in the fabric structure and the fewer of them transfer.
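
The modelling step, a multiple linear regression of the number of transferred fibres on donor-textile characteristics, can be sketched as follows. The numbers are invented for illustration (constructed so that shedding increases transfer and fibre length decreases it) and are not the thesis data:

```python
import numpy as np

# Hypothetical donor garments (NOT the thesis data): a surface-state score
# (loose / weakly attached fibres per unit area), the mean fibre length in
# mm, and the number of fibres recovered from the seat back. The counts are
# generated from fibres = 10 + 50*surface - 1*length for illustration.
surface = np.array([1.0, 2.5, 4.0, 3.0, 5.0, 2.0, 4.5, 1.5])
length = np.array([45.0, 30.0, 25.0, 35.0, 20.0, 40.0, 22.0, 50.0])
fibres = np.array([15.0, 105.0, 185.0, 125.0, 240.0, 70.0, 213.0, 35.0])

# Multiple linear regression: fibres ~ b0 + b1*surface + b2*length
X = np.column_stack([np.ones_like(surface), surface, length])
(b0, b1, b2), *_ = np.linalg.lstsq(X, fibres, rcond=None)

predicted = X @ np.array([b0, b1, b2])
```

On real data the fit would not be exact; the signs of the coefficients (positive for surface state, negative for fibre length) are what reproduces the relationship described in the abstract.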

Relevance: 10.00%

Abstract:

Introduction: The pharmaceutical aspects of drug administration in clinical trials receive poor consideration compared with the important attention devoted to the analytical and mathematical aspects of biological sample exploitation. During PK calculations, many researchers merely use the nominal declared amount for the dose, overlooking the noticeable biases that may result in the assessment of PK parameters. The aim of this work was to evaluate the biases related to the doses injected of a biosimilar drug in 2 Phase I clinical trials. Patients and Methods: In trial A, 12 healthy volunteers received different doses of a biosimilar of interferon beta-1a by either subcutaneous (SC) or intravenous (IV) injection. The doses were prepared by partially emptying 0.5-mL syringes supplied by the manufacturer (drop count procedure). In trial B, 12 healthy volunteers received 3 different formulations of the drug by IV injection (biosimilar without albumin [HSA], biosimilar with HSA and original brand [Rebif®]) and 2 different formulations as multiple SC injections (biosimilar HSA-free and original brand). In both trials, the actual dose administered was calculated as: D = C·V - losses. The product titer C was assessed by ELISA. The volume administered IV was assessed by weighing. Losses were evaluated by in vitro experiments. Finally, the binding of 125I-interferon to HSA was evaluated by counting the free and HSA-complexed molecule fractions separated by gel filtration. Results: Interferon was not significantly adsorbed onto the lines used for its IV administration. In trial A, the titer was very close to the one declared (96 ± 7%). In trial B, it differed significantly (156 ± 10% for biosimilar with/without HSA and 123 ± 5% for the original formulation). In trial A, the dose actually administered showed a large variability. The real injected volume could be biased by up to 75% compared with the theoretical volume (for the lowest dose administered [i.e., 0.03 mL]). 
This was mainly attributed to a partial re-aspiration of the drug solution before withdrawing the syringe needle. A strict procedure was therefore applied in trial B to avoid these inaccuracies. Finally, in trial B, 125I-interferon beta-1a binding to HSA appeared slow and time-dependent, reaching 50% after 16-hour incubation, which is close to the steady state reported for the comparator Rebif®. Conclusion: These practical examples (especially biases on actual titer and volume injected) illustrate that actual dose assessment deserves attention to ensure accuracy for estimates of clearance and distribution volume in the scientific literature and for registration purposes, especially for bioequivalence studies.
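
The dose correction D = C·V - losses is easy to express in code. The sketch below applies the relative biases mentioned in the abstract to a hypothetical nominal dose (the actual doses are not stated here):

```python
def actual_dose(nominal, titer_ratio, volume_ratio, losses=0.0):
    """Actual dose D = C*V - losses, with C and V expressed as the nominal
    amount scaled by the relative biases on titer and injected volume."""
    return nominal * titer_ratio * volume_ratio - losses

nominal = 6.0  # hypothetical nominal amount (e.g. µg of interferon beta-1a)

# Trial B: titer alone was 156 % of the declared value (volume controlled):
d_b = actual_dose(nominal, titer_ratio=1.56, volume_ratio=1.0)
bias_b = d_b / nominal - 1   # +56 % from the titer bias alone

# Trial A, lowest dose: titer close to nominal (96 %) but the injected
# volume could be off by up to 75 %:
d_a_low = actual_dose(nominal, titer_ratio=0.96, volume_ratio=0.25)
```

Using `d_b` rather than `nominal` in a clearance calculation is exactly the correction the abstract argues for.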

Relevance: 10.00%

Abstract:

PURPOSE: Transferrin (Tf) expression is enhanced by aging and inflammation in humans. We investigated the role of transferrin in glial protection. METHODS: We generated transgenic mice (Tg) carrying the complete human transferrin gene on a C57Bl/6J genetic background. We studied human (hTf) and mouse (mTf) transferrin localization in Tg and wild-type (WT) C57Bl/6J mice using immunochemistry with specific antibodies. Müller glial (MG) cells were cultured from explants and characterized using cellular retinaldehyde binding protein (CRALBP) and vimentin antibodies. They were further subcultured for study. We incubated cells with FeCl(3)-nitrilotriacetate to test for the iron-induced stress response; viability was determined by direct counting and measurement of lactate dehydrogenase (LDH) activity. Tf expression was determined by reverse transcriptase-quantitative PCR with human- or mouse-specific probes. hTf and mTf in the medium were assayed by ELISA or radioimmunoassay (RIA), respectively. RESULTS: mTf was mainly localized in the retinal pigment epithelium and ganglion cell layers in retina sections of both mouse lines. hTf was abundant in MG cells. The distribution of mTf and hTf mRNA was consistent with these findings. mTf and hTf were secreted into the medium of MG cell primary cultures. Cells from Tg mice secreted hTf at a particularly high level. However, both WT and Tg cell cultures lost their ability to secrete Tf after a few passages. Tg MG cells secreting hTf were more resistant to iron-induced stress toxicity than those that no longer secreted hTf. Similarly, exogenous human apo-Tf, but not human holo-Tf, conferred resistance to iron-induced stress on MG cells from WT mice. CONCLUSIONS: hTf localization in MG cells from Tg mice was reminiscent of that reported for aged human retina and age-related macular degeneration, both conditions associated with iron deposition. 
The role of hTf in protection against toxicity in Tg MG cells probably involves an adaptive mechanism developed in neural retina to control iron-induced stress.

Relevance: 10.00%

Abstract:

Adjuvant chemotherapy decisions in breast cancer are increasingly based on the pathologist's assessment of tumor proliferation. The Swiss Working Group of Gyneco- and Breast Pathologists has surveyed inter- and intraobserver consistency of Ki-67-based proliferative fraction in breast carcinomas. METHODS: Five pathologists evaluated the MIB-1-labeling index (LI) in ten breast carcinomas (G1, G2, G3) by counting and by eyeballing. In the same way, 15 pathologists from all over Switzerland then assessed MIB-1-LI on three G2 carcinomas, in self-selected or pre-defined areas of the tumors, comparing centrally immunostained slides with slides immunostained in the different laboratories. To study intra-observer variability, the same tumors were re-examined 4 months later. RESULTS: The Kappa values for the first series of ten carcinomas of various degrees of differentiation showed good to very good agreement for MIB-1-LI (Kappa 0.56-0.72). However, we found very high inter-observer variabilities (Kappa 0.04-0.14) in the read-outs of the G2 carcinomas. It was not possible to explain the inconsistencies exclusively by any of the following factors: (i) pathologists' divergent definitions of what counts as a positive nucleus, (ii) the mode of assessment (counting vs. eyeballing), (iii) the immunostaining technique, and (iv) the selection of the tumor area in which to count. Despite intensive confrontation of all participating pathologists with the problem, inter-observer agreement did not improve when the same slides were re-examined 4 months later (Kappa 0.01-0.04), and intra-observer agreement was likewise poor (Kappa 0.00-0.35). CONCLUSION: Assessment of mid-range Ki-67-LI suffers from high inter- and intra-observer variability. Oncologists should be aware of this caveat when using Ki-67-LI as a basis for treatment decisions in moderately differentiated breast carcinomas.
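
The agreement figures above are Cohen's kappa values, which correct raw rater agreement for the agreement expected by chance alone. A self-contained sketch with invented Ki-67 category readings from two hypothetical pathologists:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: rater agreement corrected for chance agreement."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented Ki-67 readings ("low"/"mid"/"high") from two hypothetical raters:
p1 = ["low", "mid", "mid", "high", "low", "mid", "high", "mid", "low", "mid"]
p2 = ["low", "mid", "high", "high", "mid", "mid", "high", "low", "low", "mid"]

kappa = cohens_kappa(p1, p2)  # 70 % raw agreement shrinks once chance is removed
```

Here the raters agree on 7 of 10 cases, yet kappa is only about 0.54, which is why the abstract reports kappa rather than percent agreement.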

Relevance: 10.00%

Abstract:

Given the multiplicity of nanoparticles (NPs), there is a requirement to develop screening strategies to evaluate their toxicity. Within the EU-funded FP7 NanoTEST project, a panel of medically relevant NPs has been used to develop alternative testing strategies of NPs used in medical diagnostics. As conventional toxicity tests cannot necessarily be directly applied to NPs in the same manner as for soluble chemicals and drugs, we determined the extent of interference of NPs with each assay process and components. In this study, we fully characterized the panel of NP suspensions used in this project (poly(lactic-co-glycolic acid)-polyethylene oxide [PLGA-PEO], TiO2, SiO2, and uncoated and oleic-acid coated Fe3O4) and showed that many NP characteristics (composition, size, coatings, and agglomeration) interfere with a range of in vitro cytotoxicity assays (WST-1, MTT, lactate dehydrogenase, neutral red, propidium iodide, (3)H-thymidine incorporation, and cell counting), pro-inflammatory response evaluation (ELISA for GM-CSF, IL-6, and IL-8), and oxidative stress detection (monoBromoBimane, dichlorofluorescein, and NO assays). Interferences were assay specific as well as NP specific. We propose how to integrate and avoid interference with testing systems as a first step of a screening strategy for biomedical NPs.

Relevance: 10.00%

Abstract:

In radionuclide metrology, Monte Carlo (MC) simulation is widely used to compute parameters associated with primary measurements or calibration factors. Although MC methods are used to estimate uncertainties, the uncertainty associated with radiation transport in MC calculations is usually difficult to estimate. Counting statistics is the most obvious component of MC uncertainty and has to be checked carefully, particularly when variance reduction is used. However, in most cases fluctuations associated with counting statistics can be reduced using sufficient computing power. Cross-section data have intrinsic uncertainties that induce correlations when apparently independent codes are compared. Their effect on the uncertainty of the estimated parameter is difficult to determine and varies widely from case to case. Finally, the most significant uncertainty component for radionuclide applications is usually that associated with the detector geometry. Recent 2D and 3D x-ray imaging tools may be utilized, but comparison with experimental data as well as adjustments of parameters are usually inevitable.
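
The counting-statistics component behaves like any binomial estimate: its standard uncertainty shrinks as 1/sqrt(N) with the number of simulated histories. A toy illustration (the efficiency value and seed are arbitrary, not a real transport calculation):

```python
import random

random.seed(42)

def mc_efficiency(n_histories, p_true=0.35):
    """Toy MC run: fraction of simulated decays that register a count,
    plus the binomial (counting-statistics) uncertainty of that fraction."""
    hits = sum(random.random() < p_true for _ in range(n_histories))
    eff = hits / n_histories
    u = (eff * (1 - eff) / n_histories) ** 0.5
    return eff, u

eff_small, u_small = mc_efficiency(1_000)
eff_large, u_large = mc_efficiency(100_000)

# 100x more histories shrinks the counting-statistics uncertainty ~10x.
ratio = u_small / u_large
```

This is the component that "sufficient computing power" reduces; the cross-section and geometry components discussed above do not average away like this.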

Relevance: 10.00%

Abstract:

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
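
The approach described — drawing the input quantities from their assigned distributions and pushing each draw through the measurement model — can be sketched for a simple activity-calibration model A = R/(eps·m). The input values and uncertainties below are hypothetical, not the chapter's example:

```python
import random
import statistics

random.seed(1)
M = 50_000  # number of Monte Carlo trials

# Toy measurement model for an activity calibration: A = R / (eps * m), with
# R the net count rate [1/s], eps the counting efficiency and m the source
# mass [g]. Input values and uncertainties are hypothetical.
samples = []
for _ in range(M):
    R = random.gauss(5230.0, 25.0)      # counting statistics
    eps = random.gauss(0.92, 0.01)      # efficiency from the LS model
    m = random.gauss(0.1003, 0.0002)    # weighing
    samples.append(R / (eps * m))

A_hat = statistics.fmean(samples)   # estimator of the expectation of A
u_A = statistics.stdev(samples)     # estimator of the standard deviation
rel_u = u_A / A_hat
```

The resulting relative uncertainty (about 1.2 % here) matches the first-order quadrature combination of the input relative uncertainties, as expected for a nearly linear model.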

Relevance: 10.00%

Abstract:

In diffusion MRI, traditional tractography algorithms do not recover truly quantitative tractograms and the structural connectivity has to be estimated indirectly by counting the number of fiber tracts or averaging scalar maps along them. Recently, global and efficient methods have emerged to estimate more quantitative tractograms by combining tractography with local models for the diffusion signal, like the Convex Optimization Modeling for Microstructure Informed Tractography (COMMIT) framework. In this abstract, we show the importance of using both (i) proper multi-compartment diffusion models and (ii) adequate multi-shell acquisitions, in order to evaluate the accuracy and the biological plausibility of the tractograms.
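
Frameworks like COMMIT recover streamline weights by solving a non-negative least-squares problem y ≈ A·x, where each column of A holds the signal contribution of one candidate tract. A toy sketch of that inverse problem, using a tiny invented matrix (not a real diffusion model) and a simple projected-gradient solver:

```python
import numpy as np

def nnls_pg(A, y, iters=5000, lr=0.2):
    """Non-negative least squares by projected gradient descent.
    lr must be below 2 / ||A.T @ A|| for convergence."""
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        w = np.clip(w - lr * A.T @ (A @ w - y), 0.0, None)
    return w

# Toy forward model: 4 measurements, 3 candidate streamlines; column j is
# the assumed signal contribution of streamline j (invented numbers).
A = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
true_w = np.array([2.0, 0.0, 1.5])   # streamline 2 carries no signal
y = A @ true_w                        # noiseless synthetic data

w = nnls_pg(A, y)                     # recovered non-negative weights
```

The non-negativity constraint is what lets such methods zero out spurious tracts instead of assigning them small negative contributions.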

Relevance: 10.00%

Abstract:

A version of cascaded systems analysis was developed specifically with the aim of studying quantum noise propagation in x-ray detectors. Signal and quantum noise propagation was then modelled in four types of x-ray detectors used for digital mammography: four flat-panel (FP) systems, one computed radiography system and one slot-scan, silicon-wafer-based photon-counting device. As required inputs to the model, the two-dimensional (2D) modulation transfer function (MTF), noise power spectra (NPS) and detective quantum efficiency (DQE) were measured for six mammography systems that utilized these different detectors. A new method to reconstruct anisotropic 2D presampling MTF matrices from 1D radial MTFs measured along different angular directions across the detector is described; an image of a sharp, circular disc was used for this purpose. The effective pixel fill factor for the FP systems was determined from the axial 1D presampling MTFs measured with a square sharp edge along the two orthogonal directions of the pixel lattice. Expectation MTFs (EMTFs) were then calculated by averaging the radial MTFs over all possible phases, and the 2D EMTF was formed with the same reconstruction technique used for the 2D presampling MTF. The quantum NPS was then established by noise decomposition from homogeneous images acquired as a function of detector air kerma. This was further decomposed into the correlated and uncorrelated quantum components by fitting the radially averaged quantum NPS with the radially averaged squared EMTF. This whole procedure allowed a detailed analysis of the influence of aliasing, signal and noise decorrelation, x-ray capture efficiency and global secondary gain on the NPS and detector DQE. The influence of noise statistics, pixel fill factor and additional electronic and fixed-pattern noises on the DQE was also studied. The 2D cascaded model and the decompositions performed on the acquired images also explained the observed quantum NPS and DQE anisotropy.
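
The measured quantities combine in the standard form DQE(f) = S²·MTF²(f)/(q·NPS(f)), with S the large-area signal and q the incident fluence. The sketch below uses placeholder MTF and NPS shapes (a Gaussian MTF, and a quantum NPS plus a white additive-noise floor), not the measured systems, to show how additive noise degrades DQE at high frequency:

```python
import math

def dqe(f, mtf, nps, mean_signal, fluence):
    """Standard form DQE(f) = S^2 * MTF(f)^2 / (q * NPS(f))."""
    return mean_signal ** 2 * mtf(f) ** 2 / (fluence * nps(f))

# Hypothetical detector (placeholder shapes, not a measured system):
q = 5.0e4                                   # incident photons / mm^2
S = 0.8 * q                                 # large-area signal, gain folded in
mtf = lambda f: math.exp(-(f / 5.0) ** 2)   # smooth Gaussian MTF, f in mm^-1
# Quantum NPS proportional to MTF^2 (0.7 = assumed zero-frequency DQE of the
# quantum stage) plus a white additive electronic / fixed-pattern floor:
nps = lambda f: (S ** 2 / q) * mtf(f) ** 2 / 0.7 + 2000.0

dqe_zero = dqe(0.0, mtf, nps, S, q)   # just below 0.7 because of the floor
dqe_high = dqe(5.0, mtf, nps, S, q)   # additive noise degrades DQE at high f
```

Separating the quantum NPS from the additive components, as the abstract describes, is what makes this frequency-dependent degradation visible.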