61 results for cure fraction models
Resumo:
Computational power is increasing day by day. Despite that, some tasks are still difficult or even impossible for a computer to perform. For example, while identifying a facial expression is easy for a human, for a computer it is still an area under development. To tackle this and similar issues, crowdsourcing has grown as a way to use human computation on a large scale. Crowdsourcing is a novel approach to collect labels in a fast and cheap manner, by sourcing the labels from the crowd. However, these labels lack reliability, since annotators are not guaranteed to have any expertise in the field. This fact has led to a new research area in which annotation models must be created or adapted to handle such weakly labeled data. Current techniques explore the annotators' expertise and the task difficulty as variables that influence label correctness. Other specific aspects are also considered by noisy-label analysis techniques. The main contribution of this thesis is the process used to collect reliable crowdsourcing labels for a facial expression dataset. This process consists of two steps: first, we design our crowdsourcing tasks to collect annotators' labels; next, we infer the true label from the collected labels by applying state-of-the-art crowdsourcing algorithms. At the same time, a facial expression dataset is created, containing 40,000 images and respective labels. At the end, we publish the resulting dataset.
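As a hedged illustration only (not the thesis' own pipeline, which applies more sophisticated state-of-the-art algorithms), the simplest aggregation baseline for inferring one label per image from multiple crowd annotations is a majority vote; the image IDs and labels below are made up:

```python
# Hedged illustration: majority-vote aggregation of crowd labels.
# The thesis uses state-of-the-art crowdsourcing algorithms (e.g., ones
# that model annotator expertise); this only shows the data shape.
from collections import Counter

# annotations: image_id -> labels given by different annotators (illustrative)
annotations = {
    "img_001": ["happy", "happy", "neutral"],
    "img_002": ["sad", "angry", "sad", "sad"],
}

def majority_vote(labels):
    """Return the most frequent label (ties broken arbitrarily)."""
    return Counter(labels).most_common(1)[0][0]

true_labels = {img: majority_vote(lbls) for img, lbls in annotations.items()}
print(true_labels)  # {'img_001': 'happy', 'img_002': 'sad'}
```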
Resumo:
Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized. Examples of such systems include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself well to this kind of collaboration, since visual code enables a natural flow of knowledge between developers regarding the code being developed; furthermore, communication and coordination are simplified. This proposal explores collaboration over a very structured and rigid model, where collaboration currently follows the copy-modify-merge paradigm: a developer gets his or her own private copy of the shared repository, modifies it in isolation and later uploads the changes to be merged with modifications concurrently produced by other developers. To this end, we designed and implemented an extension to the OutSystems Platform that enables real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across several developers working on the same project. We believe that it is possible to achieve a much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.
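A minimal sketch of the copy-modify-merge idea described above, with a model reduced to a dictionary of element IDs to values; this is not the OutSystems merge algorithm, and element deletion is ignored for simplicity:

```python
# Hedged sketch of three-way (base/mine/theirs) merging, the core step of
# the copy-modify-merge paradigm. Purely illustrative data model.
def three_way_merge(base, mine, theirs):
    merged, conflicts = {}, []
    for key in set(base) | set(mine) | set(theirs):
        b, m, t = base.get(key), mine.get(key), theirs.get(key)
        if m == t:                      # both sides agree (or both unchanged)
            merged[key] = m
        elif m == b:                    # only "theirs" changed this element
            merged[key] = t
        elif t == b:                    # only "mine" changed this element
            merged[key] = m
        else:                           # both changed it differently
            conflicts.append(key)
            merged[key] = m             # keep local change, flag for review
    return merged, conflicts

merged, conflicts = three_way_merge(
    base={"screen": "v1", "logic": "v1"},
    mine={"screen": "v2", "logic": "v1"},
    theirs={"screen": "v1", "logic": "v3"},
)
print(merged, conflicts)   # {'screen': 'v2', 'logic': 'v3'} []
```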
Resumo:
The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can contribute to increasing the efficiency of drug development in the pharmaceutical industry. Recapitulation of liver functionality in vitro requires the development of advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and limited compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In Chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized towards the improvement of hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In Chapter 3, human hepatocytes were co-cultivated with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and that hepatocytes retain differentiated hepatic functions, stability of drug metabolism enzymes and higher viability in co-cultures. In Chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models, which are reproducible, scalable and compatible with screening platforms. The phenotypic and functional characterization of the in vitro systems contributes to the state of the art of human hepatic cell models and can be applied to improving the efficiency of pre-clinical drug development, to disease modeling and, ultimately, to the development of cell-based therapeutic strategies for liver failure.
Resumo:
This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, the current paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition methodologies. By grouping returns into sequences, Hidden Markov Models can then predict market direction in the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, the current paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
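A minimal sketch of the underlying idea (not the thesis code): fit a Gaussian HMM to a sequence of returns and read the decoded hidden states as a proxy for market regime. It assumes the hmmlearn and numpy packages; the synthetic returns, window of two states, and the "up"-state interpretation are illustrative assumptions:

```python
# Hedged illustration: a 2-state Gaussian HMM over daily returns, with the
# higher-mean state interpreted as the "up" regime. Assumes hmmlearn/numpy.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=1000)   # placeholder daily returns

X = returns.reshape(-1, 1)                      # hmmlearn expects 2D input
model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200)
model.fit(X)

states = model.predict(X)                       # decoded hidden states
up_state = int(np.argmax(model.means_.ravel())) # state with higher mean return
predicted_up = states == up_state
print(f"fraction of days labeled 'up': {predicted_up.mean():.2f}")
```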
Resumo:
The life of humans and of most living beings depends on sensation and perception for the best assessment of the surrounding world. Sensorial organs acquire a variety of stimuli that are interpreted and integrated in our brain for immediate use, or stored in memory for later recall. Among the reasoning aspects, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and forming part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory, by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But prior to assessing data for its usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. Systems' limitations in handling sensorial data, compared with our own sensorial capabilities, constitute an identified problem. Another problem is the lack of interoperability between humans and devices, as devices do not properly understand humans' emotional states and needs. Addressing those problems is the motivation for the present research work. The mission hereby assumed is to include sensorial and physiological data in a Framework that manages collected data towards human cognitive functions, supported by a new data model. By learning from selected human functional and behavioural models and by reasoning over collected data, the Framework aims at evaluating a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person's life, with physiological indicators of emotional states, to be used by new-generation applications.
Resumo:
Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed due to the urgency of sheltering affected populations. However, by neglecting risks of exposure in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to better withstand the passage of time and natural disasters, in the safest way possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology was proposed for flood-resilient housing models, in which key criteria that housing should meet were identified. The methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives – one of the countries most vulnerable to sea level rise resulting from climate change – was analyzed in light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology, with the intent of assessing the resilience to floods of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.
Resumo:
This research is titled “The Future of Airline Business Models: Which Will Win?” and is part of the requirements for the award of a Master's in Management from Nova SBE and another from LUISS Guido Carli University. The purpose is to elaborate a complete market analysis of the European air transportation industry in order to predict which airlines, strategies and business models may be successful in the coming years. First, an extensive literature review of the business model concept was carried out. Then, a detailed overview of the main European airlines and of the strategies they have been implementing so far was developed. Finally, the research is illustrated with three case studies.
Resumo:
Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity of accounting for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, such cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999),
the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller one may also win, provided its relative size is not too small; a larger share of self-delusional voters in the minority party decreases this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
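A hedged numerical sketch of the Chapter 2 idea described above: a symmetric linear Cournot duopoly in which each firm either pays a thinking cost and best-responds, or plays a free default quantity. The demand/cost parameters, the default quantity, and the simplified unilateral-deviation check are illustrative assumptions, not values from the thesis:

```python
# Hedged sketch: Cournot duopoly with a thinking cost k and a default
# quantity q_d. Inverse demand P = a - b*Q, constant marginal cost c.
a, b, c = 10.0, 1.0, 1.0
q_d = 2.0                          # illustrative default quantity

def profit(qi, qj):
    return (a - b * (qi + qj) - c) * qi

def best_response(qj):
    return max((a - c - b * qj) / (2 * b), 0.0)

def equilibrium_types(k):
    """Profiles surviving a simple unilateral-deviation check."""
    out = []
    q_star = (a - c) / (3 * b)                     # both think: Cournot quantity
    if profit(q_star, q_star) - k >= profit(q_d, q_star):
        out.append("both think")
    q_br = best_response(q_d)                      # one thinks, one stays at default
    thinker_ok = profit(q_br, q_d) - k >= profit(q_d, q_d)
    default_ok = profit(q_d, q_br) >= profit(best_response(q_br), q_br) - k
    if thinker_ok and default_ok:
        out.append("one thinks, one default")
    if profit(q_d, q_d) >= profit(best_response(q_d), q_d) - k:
        out.append("both default")
    return out

for k in (0.0, 2.0, 3.0):                          # low, intermediate, high thinking cost
    print(k, equilibrium_types(k))
```

With these illustrative numbers, a low cost yields the symmetric "both think" outcome, an intermediate cost yields the asymmetric "one thinks, one default" equilibrium mentioned in the abstract, and a high cost yields "both default".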
Resumo:
Contains abstract
Resumo:
Composite materials have a complex behavior, which is difficult to predict under different types of loads. In the course of this dissertation, a methodology was developed to predict failure and damage propagation in composite material specimens. This methodology uses finite element numerical models created with the Ansys and Matlab software. The methodology performs an incremental-iterative analysis, which gradually increases the load applied to the specimen. Several structural failure phenomena are considered, such as fiber and/or matrix failure, delamination or shear plasticity. Failure criteria based on element stresses were implemented, and a procedure to reduce the stiffness of the failed elements was prepared. The material used in this dissertation consists of a spread-tow carbon fabric with a 0°/90° arrangement, and the main numerical model analyzed is a 26-ply specimen under compression loads. Numerical results were compared with the results of specimens tested experimentally, whose mechanical properties were unknown; only the geometry of the specimen was known. The material properties of the numerical model were adjusted in the course of this dissertation in order to minimize the difference between the numerical and experimental results, reaching an error lower than 5% (i.e., numerical model identification based on the experimental results was performed).
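A schematic sketch of the incremental-iterative progressive-failure loop described above (not the dissertation's Ansys/Matlab code): the load is increased step by step, a stress-based failure criterion is checked, and failed elements have their stiffness degraded before re-solving. The `solve_fem` function is a hypothetical stand-in for the external finite element solve, and all numbers are illustrative:

```python
# Hedged sketch of an incremental-iterative progressive-failure analysis.
def solve_fem(elements, load):
    """Placeholder for the FE solver: crude element stress estimates."""
    return {eid: load / max(e["stiffness"], 1e-9) for eid, e in elements.items()}

def progressive_failure(elements, max_load, load_step, strength, knockdown=0.1):
    load = 0.0
    while load < max_load:
        load += load_step                    # load increment
        changed = True
        while changed:                       # iterate at constant load until no new failures
            changed = False
            stresses = solve_fem(elements, load)
            for eid, stress in stresses.items():
                e = elements[eid]
                if not e["failed"] and stress >= strength:   # max-stress criterion
                    e["failed"] = True
                    e["stiffness"] *= knockdown              # stiffness reduction
                    changed = True
        if all(e["failed"] for e in elements.values()):
            return load                      # estimated collapse load
    return max_load

elements = {i: {"stiffness": 100.0, "failed": False} for i in range(4)}
print(progressive_failure(elements, max_load=5000.0, load_step=100.0, strength=10.0))
```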
Resumo:
Field lab: Business project
Resumo:
We intend to study the algebraic structure of simple orthogonal models in order to use them, through binary operations, as building blocks in the construction of more complex orthogonal models. We start by presenting some matrix results concerning Commutative Jordan Algebras of symmetric matrices (CJAs). Next, we use these results to study the algebraic structure of orthogonal models obtained by crossing and nesting simpler ones. Then, we study normal models with OBS (Orthogonal Block Structure), NOBS (Normal Orthogonal Block Structure), which can also be orthogonal models, obtaining conditions for the existence of complete and sufficient statistics and, hence, of UMVUE, i.e., unbiased estimators with minimal covariance matrices whatever the variance components. Lastly, following Pereira et al. (2014), we study the algebraic structure of orthogonal models, mixed models whose variance-covariance matrices are all positive semi-definite linear combinations of known orthogonal pairwise orthogonal projection matrices (OPOPM), and whose least squares estimators (LSE) of estimable vectors are best linear unbiased estimators (BLUE) whatever the variance components, so that they are uniformly BLUE (UBLUE). From the results on the algebraic structure, we obtain explicit expressions for the LSE of these models.
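As a hedged formal restatement of the structure summarized above (the notation is ours, not from the abstract), the OBS condition on the variance-covariance matrix and a commonly used commutation condition for the LSE to be UBLUE can be written as:

```latex
% OBS: the variance-covariance matrix is a non-negative combination of
% known, pairwise orthogonal, orthogonal projection matrices adding up
% to the identity.
\[
  \mathbf{V}(\boldsymbol{\gamma}) \;=\; \sum_{j=1}^{w} \gamma_j\, \mathbf{Q}_j,
  \qquad \gamma_j \ge 0, \qquad
  \mathbf{Q}_j \mathbf{Q}_{j'} = \mathbf{0}\ (j \neq j'), \qquad
  \sum_{j=1}^{w} \mathbf{Q}_j = \mathbf{I}_n .
\]
% A commonly used condition for the LSE of estimable vectors to be UBLUE
% (i.e., for the model to be orthogonal) is that the orthogonal projection
% T onto the space spanned by the mean vector commutes with every Q_j:
\[
  \mathbf{T}\,\mathbf{Q}_j \;=\; \mathbf{Q}_j\,\mathbf{T}, \qquad j = 1,\dots,w .
\]
```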
Resumo:
Both culture coverage and digital journalism are contemporary phenomena that have undergone several transformations within a short period of time. Whenever the media enter a period of uncertainty such as the present one, there is an attempt to innovate in order to seek sustainability, skip the crisis or find a new public. This indicates that there are new trends to be understood and explored, i.e., how are media innovating in a digital environment? Not only does the professional debate about the future of journalism justify the need to explore the issue, but so do the academic approaches to cultural journalism. However, none of the studies so far has considered innovation as a motto or driver and tried to explain how the media are covering culture, achieving sustainability and engaging with readers in a digital environment. This research examines how European media which specialize in culture, or have an important cultural section, are innovating in a digital environment. Specifically, we look at how these innovation strategies are being pursued in relation to the approach to culture and dominant cultural areas, editorial models, the use of digital tools for telling stories, overall brand positioning and extensions, engagement with the public, and business models. We conducted a mixed-methods study combining case studies of four media projects, which integrates qualitative web feature and content analysis with quantitative web content analysis. The four case studies chosen were two major general-interest journalistic brands which started as print newspapers – The Guardian (London, UK) and Público (Lisbon, Portugal) – a magazine specialized in international affairs, culture and design – Monocle (London, UK) – and a native digital media project launched by a cultural organization – Notodo, by La Fábrica. Findings suggest, on the one hand, that we are witnessing a paradigm shift in culture coverage in a digital environment, challenging traditional boundaries related to cultural themes and scope, angles, genres, content format and delivery, engagement and business models. Innovation in the four case studies lies especially along the product dimensions (format and content), brand positioning and process (business model and ways to engage with users). On the other hand, there are still perennial values that are crucial to innovation and sustainability, such as commitment to journalism, consistency (to the reader, to brand extensions and to the advertiser), intelligent differentiation and the capability of knowing what innovation means and how it can be applied, since this thesis also confirms that one formula doesn't suit all. Changing minds, overcoming cultural inertia and optimizing the memory of the websites – looking at them as living, organic bodies which continuously interact with readers in many different ways, and not as a closed collection of articles – are still the main challenges for some media.
Resumo:
Neurological disorders are a major concern in modern societies, with increasing prevalence mainly related to higher life expectancy. Most of the currently available therapeutic options can only control and ameliorate the patients' symptoms, often becoming refractory over time. Therapeutic breakthroughs and advances have been hampered by the lack of accurate central nervous system (CNS) models. The development of these models allows the study of disease onset/progression mechanisms and the preclinical evaluation of novel therapeutics. This has traditionally relied on genetically engineered animal models, which often diverge considerably from the human phenotype (developmentally, anatomically and physiologically), and on 2D in vitro cell models, which fail to recapitulate the characteristics of the target tissue (cell-cell and cell-matrix interactions, cell polarity). The in vitro recapitulation of CNS phenotypic and functional features requires the implementation of advanced culture strategies that enable mimicking the in vivo structural and molecular complexity. Models based on the differentiation of human neural stem cells (hNSC) in 3D cultures have great potential as complementary tools in preclinical research, bridging the gap between human clinical studies and animal models. This thesis aimed at the development of novel human 3D in vitro CNS models by integrating agitation-based culture systems and a wide array of characterization tools. Neural differentiation of hNSC as 3D neurospheres was explored in Chapter 2. Here, it was demonstrated that human midbrain-derived neural progenitor cells of fetal origin (hmNPC) can generate complex tissue-like structures containing functional dopaminergic neurons, as well as astrocytes and oligodendrocytes. Chapter 3 focused on the development of cellular characterization assays for cell aggregates based on light-sheet fluorescence imaging systems, which resulted in increased spatial resolution both for fixed samples and live imaging. The applicability of the developed human 3D cell model to preclinical research was explored in Chapter 4, evaluating the potential of a viral vector candidate for gene therapy. The efficacy and safety of helper-dependent CAV-2 (hd-CAV-2) for gene delivery in human neurons was evaluated, demonstrating increased neuronal tropism, efficient transgene expression and minimal toxicity. The potential of human 3D in vitro CNS models to mimic brain functions was further addressed in Chapter 5. Exploring the use of 13C-labeled substrates and Nuclear Magnetic Resonance (NMR) spectroscopy tools, neural metabolic signatures were evaluated, showing lineage-specific metabolic specialization and the establishment of neuron-astrocytic shuttles upon differentiation. Chapter 6 focused on transferring the knowledge and strategies described in the previous chapters to the implementation of a scalable and robust process for the 3D differentiation of hNSC derived from human induced pluripotent stem cells (hiPSC). Here, software-controlled perfusion stirred-tank bioreactors were used as the technological system to sustain cell aggregation and differentiation. The work developed in this thesis provides practical and versatile new in vitro approaches to model the human brain. Furthermore, the culture strategies described herein can be extended to other sources of neural phenotypes, including patient-derived hiPSC.
The combination of this 3D culture strategy with the implemented characterization methods represents a powerful complementary tool applicable in drug discovery, toxicology and disease modeling.
Resumo:
The purpose of this work was to determine alternative radiotherapy regimens for the treatment of prostate cancer using external beam radiotherapy (EBRT) and low dose-rate brachytherapy (LDRBT) with Iodine-125 permanent implants which are biologically equivalent to conventional clinical treatments, by the use of theoretical models and Monte Carlo (MC) techniques. The concepts of biologically effective dose (BED) and equivalent uniform dose (EUD), together with the linear-quadratic (LQ) model, were used to determine equivalent treatment regimens. In a first approach, the BED concept was used to determine: 1) hypofractionated EBRT schemes maintaining the late rectal complications of the conventional regimens with total doses of 75.6 Gy, 77.4 Gy, 79.2 Gy and 81.0 Gy; and 2) the relationship between the total doses of EBRT and LDRBT needed to keep the BED of the conventional treatment of 45 Gy of EBRT and 110 Gy of LDRBT. In a second approach, the MC code MCNPX was used to simulate dose distributions of EBRT and LDRBT in two voxel phantoms segmented from computed tomography images of patients with prostate cancer. The results of the EBRT and LDRBT simulations were added up and an overall EUD was determined in order to obtain: 1) schemes equivalent to the conventional treatment of 25 fractions of 1.8 Gy of EBRT in combination with 110 Gy of LDRBT; and 2) schemes equivalent to prostate EUDs of 67 Gy, 72 Gy, 80 Gy, 90 Gy, 100 Gy and 110 Gy. All results show a theoretical therapeutic gain in using hypofractionated EBRT schemes. For a rectal BED equivalent to the conventional regimen, an increase of 2% in the prostate BED was achieved with 5 fewer fractions. This increase becomes more pronounced as the number of fractions decreases, amounting to 10-11% with 20 fewer fractions and 35-45% with 40 fewer fractions. Considering the results of the EBRT simulations, an average EUD of 107 Gy was obtained for the prostate and of 42 Gy for the rectum with the conventional scheme of 110 Gy of LDRBT followed by 25 fractions of 1.8 Gy of EBRT. In terms of tumor control probability (same EUD), delivering the EBRT in 66 fractions of 1.8 Gy, 56 fractions of 2 Gy, 40 fractions of 2.5 Gy, 31 fractions of 3 Gy, 20 fractions of 4 Gy or 13 fractions of 5 Gy is equivalent to this treatment. Relative to the use of 66 fractions of 1.8 Gy, the rectum EUD is reduced by 6% with 2.5 Gy per fraction and by 10% with 4 Gy per fraction. A total BED of 162 Gy was obtained for the delivery of 25 fractions of 1.8 Gy of EBRT in combination with 110 Gy of LDRBT. By varying the total dose of LDRBT (TDLDRBT) with the total dose of EBRT (TDEBRT) so as to ensure a BED of 162 Gy, the following relationship was obtained: ....... The simulation results show that the rectum EUD decreases with increasing TDLDRBT for an EBRT dose per fraction (dEBRT) below 2.5 Gy, and increases for dEBRT of 3 Gy and above. For lower values of TDLDRBT (< 50 Gy), the rectum benefits from larger EBRT fractions. As the TDLDRBT increases, the generalized rectum EUD (gEUD) becomes less dependent on the dEBRT.
This work shows that it is possible to use different radiotherapy regimens for prostate cancer that enable a therapeutic gain, whether by delivering a higher biological dose with the same late effects or by maintaining the dose to the tumor and reducing rectal toxicity. The cautious use of hypofractionated EBRT regimens, in addition to the therapeutic benefit, can bring advantages in terms of convenience for the patient and cost savings. The simulation results of this study, together with the biological dose conversions for the treatment of prostate cancer, serve as theoretical guidelines of interest for new clinical trials.
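For reference, a hedged sketch of the standard linear-quadratic and EUD quantities underlying these equivalences; the symbols (n, d, alpha/beta, a, v_i, D_i) are the usual LQ/gEUD notation and are not taken from the abstract itself:

```latex
% Standard linear-quadratic BED for n fractions of dose d (alpha/beta is
% the tissue-specific fractionation sensitivity); two regimens are taken
% as biologically equivalent when their BEDs match.
\[
  \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
  \qquad
  \mathrm{BED}_1 = \mathrm{BED}_2
  \;\Longleftrightarrow\;
  n_1 d_1\!\left(1 + \tfrac{d_1}{\alpha/\beta}\right)
  = n_2 d_2\!\left(1 + \tfrac{d_2}{\alpha/\beta}\right).
\]
% Generalized equivalent uniform dose over the dose bins of a structure,
% where v_i is the fractional volume receiving dose D_i and a is a
% tissue-specific parameter.
\[
  \mathrm{gEUD} = \left(\sum_i v_i\, D_i^{\,a}\right)^{1/a}.
\]
```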