886 results for Sample selection and firm heterogeneity



Hepatitis A virus (HAV), the prototype of the genus Hepatovirus, has several unique biological characteristics that distinguish it from other members of the family Picornaviridae. Among these, its requirement for an intact eIF4G factor to initiate translation leaves it unable to shut down host protein synthesis by the mechanisms other picornaviruses use. Consequently, HAV must compete inefficiently for the cellular translational machinery, which may explain its poor growth in cell culture. In this context of virus/cell competition, HAV has adopted a naturally, highly deoptimized codon usage relative to that of its cellular host. To drive optimization of its codon usage, the virus was adapted to propagate in cells with impaired protein synthesis, making tRNA pools more available to the virus. The immediate response to the adaptation process was a significant loss of fitness, which was later recovered and was associated with a re-deoptimization, rather than an optimization, of codon usage specifically in the capsid coding region. These results exclude translational selection and instead suggest selection for fine-tuned translation kinetics as the mechanism underlying the codon usage bias in this genome region. Additionally, the results provide clear evidence of Red Queen evolutionary dynamics, since the virus re-adapted its codon usage to the changing cellular environment in order to recover its original fitness.
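Codon usage deoptimization of the kind described above is typically quantified codon by codon. The following Python sketch computes the relative synonymous codon usage (RSCU) statistic, the observed count of a codon relative to a uniform use of its synonyms; the two-amino-acid synonym table and the toy sequence are illustrative assumptions, not HAV data.

```python
from collections import Counter

# Synonym groups for two amino acids of the standard genetic code; a full
# table would cover all amino acids (illustrative subset, not HAV data).
SYNONYMS = {
    "Leu": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "Lys": ["AAA", "AAG"],
}

def rscu(sequence):
    """Relative synonymous codon usage: observed count of a codon divided
    by the count expected if its synonyms were used uniformly."""
    counts = Counter(sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3))
    scores = {}
    for syns in SYNONYMS.values():
        total = sum(counts[c] for c in syns)
        if total:
            for c in syns:
                scores[c] = counts[c] / (total / len(syns))
    return scores

# Toy coding sequence biased towards AAA for lysine.
scores = rscu("AAAAAAAAGCTG")
```

Values above 1 mark overused codons and values below 1 underused ones; a deoptimized gene shows low RSCU for the host's preferred codons.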


We work out a semiclassical theory of shot noise in ballistic n+-i-n+ semiconductor structures, aiming to study two fundamental physical correlations arising from the Pauli exclusion principle and the long-range Coulomb interaction. The theory provides a unifying scheme that, in addition to the current-voltage characteristics, describes the suppression of shot noise due to Pauli and Coulomb correlations over the whole range of system parameters and applied bias. The full scenario is summarized by a phase diagram in the plane of two dimensionless variables related to the sample length and the contact chemical potential, in which different regions of physical interest can be identified where only Coulomb or only Pauli correlations are active, or where both are present with different relevance. The predictions of the theory are fully corroborated by Monte Carlo simulations.


Elbow arthroplasty is increasingly performed in patients with rheumatic and post-traumatic arthritis, yet data on elbow periprosthetic joint infection (PJI) are limited. We investigated the characteristics and outcome of elbow PJI in a 14-year cohort of total elbow arthroplasties at a single centre. Elbow prostheses implanted between 1994 and 2007 at the Schulthess Clinic in Zurich were retrospectively screened for infection. PJI was defined as periprosthetic purulence, the presence of a sinus tract, or microbial growth. A Kaplan-Meier survival method and Cox proportional hazard analysis were performed. Of 358 elbow prostheses, PJI was identified in 27 (7.5%). The median patient age (range) was 61 (39-82) years; 63% were female. Seventeen patients (63%) had a rheumatic disorder and ten (37%) had osteoarthritis. Debridement with implant retention was performed in 78% of cases, followed by exchange or removal of the prosthesis (15%) or no surgery (7%). The relapse-free survival (95% CI) was 79% (63-95%) after 1 year and 65% (45-85%) after 2 years. The outcome after 2 years was significantly better for patients treated according to the algorithm developed for hip and knee PJI than for those who were not (100% vs. 33%, p <0.05). In the 21 patients treated with debridement and retention, the cure rate was also higher when the algorithm was followed (100% vs. 11%, p <0.05). These findings suggest that the treatment algorithm developed for hip and knee PJI can be applied to elbow PJI. With proper patient selection and antimicrobial therapy, debridement and retention of the elbow prosthesis is associated with a good treatment outcome.


Inbreeding avoidance is often invoked to explain observed patterns of dispersal, and theoretical models indeed point to a possibly important role. However, while inbreeding load is usually assumed constant in these models, it is actually bound to vary dynamically under the combined influences of mutation, drift, and selection and thus to evolve jointly with dispersal. Here we report the results of individual-based stochastic simulations allowing such a joint evolution. We show that strongly deleterious mutations should play no significant role, owing to the low genomic mutation rate for such mutations. Mildly deleterious mutations, by contrast, may create enough heterosis to affect the evolution of dispersal as an inbreeding-avoidance mechanism, but only provided that they are also strongly recessive. If slightly recessive, they will spread among demes and accumulate at the metapopulation level, thus contributing to mutational load, but not to heterosis. The resulting loss of viability may then combine with demographic stochasticity to promote population fluctuations, which foster indirect incentives for dispersal. Our simulations suggest that, under biologically realistic parameter values, deleterious mutations have a limited impact on the evolution of dispersal, which on average exceeds by only one-third the values expected from kin-competition avoidance.


The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources. 
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
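The propensity score matching step described above can be illustrated with a stdlib-only Python sketch: a logistic model estimates each respondent's propensity to belong to the posttest sample from covariates, and each treated unit is greedily paired with the nearest unused control within a caliper. The respondents, covariates, and caliper below are invented for illustration; a real analysis would use a dedicated matching library.

```python
import math

def fit_propensity(X, t, lr=0.1, epochs=2000):
    """Logistic regression by batch gradient descent: P(treated | covariates)."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, ti in zip(X, t):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for j in range(d):
                gw[j] += (p - ti) * xi[j]
            gb += p - ti
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return lambda x: 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))

def greedy_match(scores, t, caliper=0.1):
    """Pair each treated unit with the nearest unused control whose
    propensity score lies within the caliper."""
    pairs, used = [], set()
    for i in (i for i, ti in enumerate(t) if ti == 1):
        best, best_d = None, caliper
        for j in (j for j, tj in enumerate(t) if tj == 0):
            if j not in used and abs(scores[i] - scores[j]) < best_d:
                best, best_d = j, abs(scores[i] - scores[j])
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

# Invented respondents: covariates are (age, education level); t = 1 marks
# the posttest sample. Covariates are centred/scaled before fitting.
ages_edu = [(25, 1), (30, 2), (45, 3), (50, 2), (28, 1), (44, 3), (52, 2), (31, 2)]
t = [1, 1, 1, 1, 0, 0, 0, 0]
X = [((a - 38) / 10.0, e - 2.0) for a, e in ages_edu]
score = fit_propensity(X, t)
pairs = greedy_match([score(x) for x in X], t)
```

Greedy nearest-neighbour matching is order-dependent; optimal matching avoids this at a higher computational cost, which is one reason production analyses rely on specialised packages.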


Most counties have bridges that are no longer adequate and face large capital expenditures for replacement structures of the same size. In this regard, low water stream crossings (LWSCs) can provide an acceptable, low cost alternative to bridges and culverts on low volume and reduced maintenance level roads. In addition to providing a low cost option for stream crossings, LWSCs are designed to provide the additional benefit of stream bed stabilization. Considerable information on the current status of LWSCs in Iowa, along with insight into needs for design assistance, was gained from a survey of county engineers conducted as part of this research (Appendix A); copies of responses and analysis are included in Appendix B. This document provides guidelines for the design of LWSCs. There are three common types of LWSCs: unvented fords, vented fords with pipes, and low water bridges. Selection among these depends on stream geometry, discharge, importance of the road, and budget availability. To minimize exposure to tort liability, local agencies using low water stream crossings should consider adopting reasonable selection and design criteria and should provide adequate warning of these structures to road users. The design recommendations included in this report provide guidelines and suggestions for local agency reference, and several examples of design calculations are included in Appendix E.


BACKGROUND: Strategies to dissect phenotypic and genetic heterogeneity of major depressive disorder (MDD) have mainly relied on subphenotypes, such as age at onset (AAO) and recurrence/episodicity. Yet, evidence on whether these subphenotypes are familial or heritable is scarce. The aims of this study are to investigate the familiality of AAO and episode frequency in MDD and to assess the proportion of their variance explained by common single nucleotide polymorphisms (SNP heritability). METHOD: To investigate familiality, we used 691 families with 2-5 full siblings with recurrent MDD from the DeNt study. We fitted (square root) AAO and episode count in a linear and a negative binomial mixed model, respectively, with family as a random effect and adjusting for sex, age, and center. The strength of familiality was assessed with intraclass correlation coefficients (ICC). To estimate SNP heritabilities, we used 3468 unrelated MDD cases from the RADIANT and GSK Munich studies. After similarly adjusting for covariates, derived residuals were used with the GREML method in the GCTA (genome-wide complex trait analysis) software. RESULTS: Significant familial clustering was found for both AAO (ICC = 0.28) and episodicity (ICC = 0.07). From the respective ICC estimates, we calculated the maximal additive heritability of AAO (0.56) and episodicity (0.15). SNP heritability of AAO was 0.17 (p = 0.04); the analysis was underpowered for calculating the SNP heritability of episodicity. CONCLUSIONS: AAO and episodicity aggregate in families to a moderate and small degree, respectively. AAO is under stronger additive genetic control than episodicity. Larger samples are needed to calculate the SNP heritability of episodicity. The described statistical framework could be useful in future analyses.
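The intraclass correlation used above can be obtained from one-way ANOVA mean squares. A minimal Python sketch follows, with invented sibling data rather than the DeNt values.

```python
def icc_oneway(groups):
    """ICC(1): share of variance attributable to the grouping factor,
    estimated from one-way ANOVA mean squares (equal-sized groups assumed)."""
    k, n = len(groups), len(groups[0])
    grand = sum(x for g in groups for x in g) / (k * n)
    ms_between = n * sum((sum(g) / n - grand) ** 2 for g in groups) / (k - 1)
    ms_within = sum((x - sum(g) / n) ** 2 for g in groups for x in g) / (k * (n - 1))
    return (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)

# Invented sibling pairs with strongly family-clustered ages at onset.
families = [[20, 22], [30, 33], [41, 40]]
icc = icc_oneway(families)
# Doubling a full-sibling ICC gives a rough upper bound on additive
# heritability, consistent with the 0.28 -> 0.56 conversion reported above.
h2_max = min(1.0, 2 * icc)
```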


Gradients of variation, or clines, have always intrigued biologists. Classically, they have been interpreted as the outcome of antagonistic interactions between selection and gene flow. Alternatively, clines may also establish neutrally, through isolation by distance (IBD) or secondary contact between previously isolated populations. The relative importance of natural selection and of these two neutral processes in the establishment of clinal variation can be tested by comparing genetic differentiation at neutral genetic markers and at the studied trait. A third neutral process, the surfing of a newly arisen mutation during the colonization of a new habitat, is more difficult to test. Here, we designed a spatially explicit approximate Bayesian computation (ABC) simulation framework to evaluate whether the strong cline in the genetically based reddish coloration of the European barn owl (Tyto alba) arose as a by-product of a range expansion, or whether selection must be invoked to explain this colour cline, for which we have previously ruled out IBD and secondary contact. Using ABC simulations and genetic data on 390 individuals from 20 locations genotyped at 22 microsatellite loci, we first determined how barn owls colonized Europe after the last glaciation. Using these results in new simulations of the evolution of the colour phenotype, and assuming various genetic architectures for the colour trait, we demonstrate that the observed colour cline cannot be due to the surfing of a neutral mutation. Taking advantage of spatially explicit ABC, which proved to be a powerful method for disentangling the respective roles of selection and drift in range expansions, we conclude that the formation of the colour cline observed in the barn owl must be due to natural selection.
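Rejection sampling is the simplest member of the ABC family used above: draw parameters from the prior, simulate data, and keep draws whose summary statistic falls close to the observed one. The Python sketch below uses a deliberately trivial model (inferring a normal mean from a sample mean), not the spatially explicit barn-owl simulator; all numbers are invented.

```python
import random

random.seed(1)

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_sims=5000):
    """Basic rejection ABC: keep prior draws whose simulated summary
    statistic lands within eps of the observed one."""
    accepted = []
    for _ in range(n_sims):
        theta = prior_sample()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return accepted

# Trivial stand-in model: the summary statistic is the mean of 100 draws.
true_mean = 0.8
observed = sum(random.gauss(true_mean, 1.0) for _ in range(100)) / 100

posterior = abc_rejection(
    observed,
    simulate=lambda m: sum(random.gauss(m, 1.0) for _ in range(100)) / 100,
    prior_sample=lambda: random.uniform(-2.0, 2.0),
    distance=lambda a, b: abs(a - b),
    eps=0.05,
)
estimate = sum(posterior) / len(posterior)
```

Tightening `eps` trades acceptance rate for posterior accuracy; spatially explicit ABC replaces the simulator with a full range-expansion model while keeping this same accept/reject logic.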


The Choice to Adopt Risk-Sensitive Measurement Approaches for Operational Risks: the Case of the Advanced Measurement Approach under the Basel II New Capital Accord
This paper investigates the choice of operational risk approach under Basel II requirements and asks whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches to operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated: it requires the use of historical loss data, the application of statistical tools, and the engagement of highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and on prior experience with risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage over banks that opt for less sophisticated approaches.

Internal Risk Controls and their Impact on Bank Solvency
Recent cases in the financial sector have shown the influence of risk management controls on risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period 2004-2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by BHC growth strategies and by external restrictions on bank activities, while higher regulatory requirements for bank capital moderate this relationship positively.

The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Company Value
Previous research has shown the importance of external regulation for banks' behavior: inefficient standards may accentuate risk taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of the Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period 2008-2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of the risk measurement approaches that banks apply: capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.


The sampling scheme is essential when investigating the spatial variability of soil properties in soil science studies. The high cost of sampling schemes optimized with additional sampling points for each physical and chemical soil property prevents their use in precision agriculture. The purpose of this study was to obtain an optimal sampling scheme for sets of physical and chemical properties and to investigate its effect on the quality of soil sampling. Soil was sampled over a 42-ha area, at 206 geo-referenced points arranged in a regular grid spaced 50 m apart, at a depth of 0.00-0.20 m. To obtain an optimal sampling scheme for every physical and chemical property, a sample grid, a medium-scale variogram, and the extended Spatial Simulated Annealing (SSA) method were used to minimize the kriging variance. The optimization procedure was validated by constructing maps of relative improvement comparing the sample configuration before and after the process. A greater concentration of recommended points was observed in specific areas (NW-SE direction), which also reflects a greater estimation variance at these locations. The addition of optimal samples for specific regions increased the accuracy by up to 2% for chemical and 1% for physical properties. The use of a sample grid and a medium-scale variogram as prior information for the design of additional sampling schemes proved very promising for determining the locations of these additional points for all physical and chemical soil properties, enhancing the accuracy of the kriging estimates.
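Spatial simulated annealing iteratively perturbs candidate sample locations, accepting occasional worse configurations so as to escape local optima. The Python sketch below is a simplified stand-in: it optimises a spread-of-points criterion in place of the true kriging variance, and the extent and annealing parameters are invented rather than taken from the study.

```python
import math
import random

random.seed(2)

def neg_min_pairwise_distance(points):
    """Proxy objective: maximising the minimum pairwise distance spreads
    the points over the area; it stands in for the kriging variance here."""
    return -min(
        math.dist(p, q)
        for i, p in enumerate(points)
        for q in points[i + 1:]
    )

def spatial_simulated_annealing(n_points, extent, objective,
                                steps=3000, t0=1.0, cooling=0.999):
    """Perturb one sample location at a time; accept worse configurations
    with a probability that shrinks as the temperature cools."""
    points = [(random.uniform(0, extent), random.uniform(0, extent))
              for _ in range(n_points)]
    current = objective(points)
    temp = t0
    for _ in range(steps):
        k = random.randrange(n_points)
        candidate = list(points)
        candidate[k] = (
            min(extent, max(0.0, points[k][0] + random.gauss(0, extent / 20))),
            min(extent, max(0.0, points[k][1] + random.gauss(0, extent / 20))),
        )
        value = objective(candidate)
        if value < current or random.random() < math.exp((current - value) / temp):
            points, current = candidate, value
        temp *= cooling
    return points, current

# Ten extra sampling points on a square standing in for the study area
# (all numbers are illustrative, not the study's configuration).
optimized, final = spatial_simulated_annealing(10, 650.0, neg_min_pairwise_distance)
```

In the study itself, the objective would be the kriging variance computed from the fitted medium-scale variogram rather than this geometric proxy.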


The structure, magnetic response, and dielectric response of epitaxial thin films of the orthorhombic phase of YMnO3 grown on Nb:SrTiO3 (001) substrates have been measured. We have found that substrate-induced strain produces an in-plane compression of the YMnO3 unit cell. The magnetization versus temperature curves display a significant zero-field-cooling (ZFC)-field-cooling hysteresis below the Néel temperature (TN ≈ 45 K). The dielectric constant increases gradually (by up to 26%) below TN and mimics the ZFC magnetization curve. We argue that these effects could be a manifestation of magnetoelectric coupling in YMnO3 thin films and that the magnetic structure of YMnO3 can be controlled by substrate selection and/or growth conditions.


Abstract: This work concerns the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to the problem of forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as on very large problems.
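A minimal instance of clustering with a functional model trained by stochastic gradient descent is online (competitive-learning) k-means: each sample nudges its nearest centroid, which is a stochastic-gradient step on the quantisation error. The two-blob toy data below stands in for pixel spectra; this sketch is not the thesis's neural-network model.

```python
import random

random.seed(3)

def online_kmeans(data, k, lr=0.05, epochs=30):
    """Competitive learning: each sample pulls its nearest centroid towards
    it, a stochastic-gradient step on the k-means quantisation error."""
    data = list(data)
    centroids = random.sample(data, k)
    for _ in range(epochs):
        random.shuffle(data)
        for x in data:
            j = min(range(k),
                    key=lambda i: sum((c - a) ** 2 for c, a in zip(centroids[i], x)))
            centroids[j] = tuple(c + lr * (a - c) for c, a in zip(centroids[j], x))
    return centroids

def assign(x, centroids):
    """Index of the nearest centroid (squared Euclidean distance)."""
    return min(range(len(centroids)),
               key=lambda i: sum((c - a) ** 2 for c, a in zip(centroids[i], x)))

# Two well-separated blobs standing in for pixel spectra (toy data).
data = ([(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(100)]
        + [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(100)])
centroids = online_kmeans(data, 2)
```

Because each update touches a single sample, the method streams through arbitrarily large datasets and assigns unseen points with `assign`, sidestepping the out-of-sample problem mentioned above.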



Point defects of opposite signs can alternately nucleate on the -1/2 disclination line that forms near the free surface of a confined nematic liquid crystal. We show the existence of metastable configurations consisting of periodic repetitions of such defects. These configurations are characterized by a minimal interdefect spacing that is seen to depend on sample thickness and on an applied electric field. The time evolution of the defect distribution suggests that the defects attract at small distances and repel at large distances.