<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="pt-BR">
	<id>https://wiki.if.ufrgs.br/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gjfeller</id>
	<title>Instituto de Física - UFRGS - User contributions [pt-br]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.if.ufrgs.br/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gjfeller"/>
	<link rel="alternate" type="text/html" href="https://wiki.if.ufrgs.br/index.php/Especial:Contribui%C3%A7%C3%B5es/Gjfeller"/>
	<updated>2026-04-06T11:26:19Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.4</generator>
	<entry>
		<id>https://wiki.if.ufrgs.br/index.php?title=Cluster&amp;diff=2219</id>
		<title>Cluster</title>
		<link rel="alternate" type="text/html" href="https://wiki.if.ufrgs.br/index.php?title=Cluster&amp;diff=2219"/>
		<updated>2026-03-31T18:17:58Z</updated>

		<summary type="html">&lt;p&gt;Gjfeller: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Cluster Lovelace - Instituto de Física UFRGS =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The cluster is located at Instituto de Física da UFRGS, in Porto Alegre.&lt;br /&gt;
&lt;br /&gt;
== Management Committee ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The cluster is managed by professors representing the fields of Astronomy, Theoretical Physics, and Experimental Physics, in addition to an IT department employee from the Physics Institute.&lt;br /&gt;
&lt;br /&gt;
Astronomy: Rogério Riffel&lt;br /&gt;
&lt;br /&gt;
Theoretical Physics: Leonardo Brunnet&lt;br /&gt;
&lt;br /&gt;
Experimental Physics: Pedro Grande&lt;br /&gt;
&lt;br /&gt;
IT employee: Gustavo Feller&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Users Committee ==&lt;br /&gt;
&lt;br /&gt;
Users have two channels for communication/discussion: &lt;br /&gt;
&lt;br /&gt;
1) The fis-linux-if@grupos.ufrgs.br mailing list&lt;br /&gt;
&lt;br /&gt;
2) Direct messages to the IT department in https://www.if.ufrgs.br/if/informatica/requisitar-servicos.&lt;br /&gt;
&lt;br /&gt;
== Infrastructure ==&lt;br /&gt;
&lt;br /&gt;
=== Management Software ===&lt;br /&gt;
&lt;br /&gt;
Job queueing and task scheduling are handled by the [https://slurm.schedmd.com/ Slurm Workload Manager].&lt;br /&gt;
&lt;br /&gt;
Account request: https://www1.ufrgs.br/CatalogoServicos/servicos/acesso-servico?servico=6003&amp;amp;moldura=N&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The number of jobs per user is adjusted on demand.&lt;br /&gt;
&lt;br /&gt;
Number of users on 24 January 2023: 150&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hardware in the lovelace nodes ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CPU: AMD Ryzen (one 32-core and two 24-core nodes) + one 16-core AMD node&lt;br /&gt;
RAM: 64 GB per node&lt;br /&gt;
GPU: three nodes with CUDA-capable NVIDIA GPUs&lt;br /&gt;
Storage: Dell storage, 12 TB&lt;br /&gt;
Inter-node connection: Gigabit Ethernet&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Installed Software ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
OS: Debian 12 &lt;br /&gt;
Basic packages installed:&lt;br /&gt;
gcc&lt;br /&gt;
gfortran&lt;br /&gt;
python: torch, numba&lt;br /&gt;
julia&lt;br /&gt;
conda&lt;br /&gt;
compucel3d&lt;br /&gt;
espresso&lt;br /&gt;
gromacs&lt;br /&gt;
lammps&lt;br /&gt;
mesa&lt;br /&gt;
openmpi&lt;br /&gt;
povray&lt;br /&gt;
quantum-espresso&lt;br /&gt;
vasp&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Rules for scheduling, access control, and usage of the research infrastructure ==&lt;br /&gt;
&lt;br /&gt;
=== Online scheduling ===&lt;br /&gt;
&lt;br /&gt;
The cluster is accessible through the server lovelace.if.ufrgs.br via the UFRGS virtual private network ([https://www1.ufrgs.br/CatalogoServicos/servicos/servico?servico=3178 VPN]).&lt;br /&gt;
&lt;br /&gt;
To access from a Unix-like system, use:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
ssh &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Under Windows, you can configure WinSCP with the address lovelace.if.ufrgs.br.&lt;br /&gt;
&lt;br /&gt;
If you are not registered, request access by sending an email to fisica-ti@ufrgs.br.&lt;br /&gt;
&lt;br /&gt;
=== Using software in the cluster ===&lt;br /&gt;
&lt;br /&gt;
To execute a program in a cluster job, the program must:&lt;br /&gt;
&lt;br /&gt;
1. Already be installed on the cluster&lt;br /&gt;
 &lt;br /&gt;
OR&lt;br /&gt;
&lt;br /&gt;
2. Be copied to the user's home directory&lt;br /&gt;
&lt;br /&gt;
Ex:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
scp my_program &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br:~/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are compiling your program on the cluster, one option is to use &amp;lt;code&amp;gt;gcc&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Ex:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
scp -r source-code/ &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br:~/&lt;br /&gt;
ssh &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br&lt;br /&gt;
cd source-code&lt;br /&gt;
gcc main.c funcoes.c&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
This will generate the file &amp;lt;code&amp;gt;a.out&amp;lt;/code&amp;gt;, which is the executable.&lt;br /&gt;
&lt;br /&gt;
Once accessible by method 1 or 2, the program can be executed on the cluster through a &amp;lt;strong&amp;gt;JOB&amp;lt;/strong&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Note: if you run your executable without submitting it as a &amp;lt;strong&amp;gt;JOB&amp;lt;/strong&amp;gt;, it runs on the login server, not on the nodes. This is not recommended: the server's computational capacity is limited, and you will slow it down for everyone else.&lt;br /&gt;
&lt;br /&gt;
=== Creating and executing a job ===&lt;br /&gt;
&lt;br /&gt;
Slurm manages jobs; each job represents a program or task to be executed.&lt;br /&gt;
&lt;br /&gt;
To submit a new job, you must create a script file describing the requirements and characteristics of the job.&lt;br /&gt;
&lt;br /&gt;
A typical example of a submission script is shown below.&lt;br /&gt;
&lt;br /&gt;
Ex: &amp;lt;code&amp;gt;job.sh&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash &lt;br /&gt;
#SBATCH -n 1 # Number of CPUs to allocate (despite the leading #, these SBATCH lines are parsed by Slurm; only text after the second # is a comment. To disable a whole line, start it with ##.)&lt;br /&gt;
#SBATCH -N 1 # Number of nodes to allocate (you do not have to set every option; disable unneeded ones with ##)&lt;br /&gt;
#SBATCH -t 0-00:05 # Execution time limit (D-HH:MM)&lt;br /&gt;
#SBATCH -p long # Partition to submit to&lt;br /&gt;
#SBATCH --qos qos_long # QOS &lt;br /&gt;
  &lt;br /&gt;
# Your program execution commands&lt;br /&gt;
./a.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the --qos option, use the partition name with the &amp;quot;qos_&amp;quot; prefix:&lt;br /&gt;
&lt;br /&gt;
partition: short -&amp;gt; qos: qos_short -&amp;gt; limit: 2 weeks&lt;br /&gt;
&lt;br /&gt;
partition: long -&amp;gt; qos: qos_long -&amp;gt; limit: 3 months&lt;br /&gt;
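The partition-to-QoS rule above is purely mechanical: prefix the partition name with &amp;quot;qos_&amp;quot;. A tiny helper function (hypothetical, for illustration in your own submission scripts) captures it:

```shell
# Hypothetical helper: derive the QoS name from a partition name by
# prefixing "qos_", matching the naming rule described above.
qos_for() { printf 'qos_%s\n' "$1"; }

qos_for short   # -> qos_short
qos_for long    # -> qos_long
```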
  &lt;br /&gt;
If you run on a GPU, request the generic resource (GRES) &amp;quot;gpu&amp;quot; on the ada cluster:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash &lt;br /&gt;
#SBATCH -n 1 &lt;br /&gt;
#SBATCH -N 1&lt;br /&gt;
#SBATCH -t 0-00:05 &lt;br /&gt;
#SBATCH -p long &lt;br /&gt;
#SBATCH --qos qos_long # QOS &lt;br /&gt;
#SBATCH --gres=gpu:1&lt;br /&gt;
  &lt;br /&gt;
# Your program execution commands:&lt;br /&gt;
./a.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To request a specific GPU model:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#SBATCH --constraint=&amp;quot;gtx970&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To submit the job, execute:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sbatch job.sh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
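On success, sbatch replies with a line of the form &amp;quot;Submitted batch job &amp;lt;id&amp;gt;&amp;quot;. A sketch of extracting that ID for later use (shown on a sample message so it does not require a running Slurm installation; in practice you would capture the real output with msg=$(sbatch job.sh)):

```shell
# sbatch prints "Submitted batch job <id>"; the numeric ID is the last word.
# Demonstrated on a sample message, not a live sbatch call.
msg="Submitted batch job 12345"
jobid=${msg##* }   # strip everything up to the last space, keeping the ID
echo "$jobid"      # prints: 12345
# The ID can then be used with, e.g., scancel "$jobid" or squeue -j "$jobid"
```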
&lt;br /&gt;
== Useful commands ==&lt;br /&gt;
* To list jobs:&lt;br /&gt;
  squeue&lt;br /&gt;
&lt;br /&gt;
* To list all jobs running in the cluster now:&lt;br /&gt;
  sudo squeue&lt;br /&gt;
&lt;br /&gt;
* To delete a running job:&lt;br /&gt;
  scancel [job_id]&lt;br /&gt;
&lt;br /&gt;
* To list available partitions:&lt;br /&gt;
  sinfo&lt;br /&gt;
&lt;br /&gt;
* To list GPUs in the nodes:&lt;br /&gt;
  sinfo -o &amp;quot;%N %f&amp;quot;&lt;br /&gt;
&lt;br /&gt;
* To list the characteristics of all nodes:&lt;br /&gt;
  sinfo -Nel&lt;/div&gt;</summary>
		<author><name>Gjfeller</name></author>
	</entry>
	<entry>
		<id>https://wiki.if.ufrgs.br/index.php?title=Cluster&amp;diff=2187</id>
		<title>Cluster</title>
		<link rel="alternate" type="text/html" href="https://wiki.if.ufrgs.br/index.php?title=Cluster&amp;diff=2187"/>
		<updated>2025-02-26T14:59:27Z</updated>

		<summary type="html">&lt;p&gt;Gjfeller: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Cluster Lovelace - Instituto de Física UFRGS =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The cluster is located at Instituto de Física da UFRGS, in Porto Alegre.&lt;br /&gt;
&lt;br /&gt;
== Management Committee ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The cluster is managed by professors representing the fields of Astronomy, Theoretical Physics, and Experimental Physics, in addition to an IT department employee from the Physics Institute.&lt;br /&gt;
&lt;br /&gt;
Astronomy: Rogério Riffel&lt;br /&gt;
&lt;br /&gt;
Theoretical Physics: Leonardo Brunnet&lt;br /&gt;
&lt;br /&gt;
Experimental Physics: Pedro Grande&lt;br /&gt;
&lt;br /&gt;
TI employee: Gustavo Feller&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Users Committee ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Users have two channels for communication/discussion: &lt;br /&gt;
&lt;br /&gt;
1) The fis-linux-if@grupos.ufrgs.br mailing list&lt;br /&gt;
&lt;br /&gt;
2) Direct messages to the IT department via the email fisica-ti@ufrgs.br.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Infraestruture ==&lt;br /&gt;
&lt;br /&gt;
=== Management Software ===&lt;br /&gt;
&lt;br /&gt;
The system of queues and scheduling of tasks is controlled by the [https://slurm.schedmd.com/ Slurm Workload Manager].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Number of jobs per user controlled on demand.&lt;br /&gt;
&lt;br /&gt;
Number of users on 1/24/2023: 150&lt;br /&gt;
&lt;br /&gt;
Account request: mail to fisica-ti@ufrgs.br&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hardware in lovelace nodes ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CPU: Ryzen (32 and 2*24 cores) + AMD 16 cores&lt;br /&gt;
RAM: 64 GB each&lt;br /&gt;
GPU: Three nodes with NVIDIA CUDA&lt;br /&gt;
Storage: storage Dell 12TB &lt;br /&gt;
Conection inter-nodes: Gigabit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Installed Software ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
OS: Debian 12 &lt;br /&gt;
Basic packages installed:&lt;br /&gt;
gcc&lt;br /&gt;
gfortran&lt;br /&gt;
python: torch, numba&lt;br /&gt;
julia&lt;br /&gt;
conda&lt;br /&gt;
compucel3d&lt;br /&gt;
espresso&lt;br /&gt;
gromacs&lt;br /&gt;
lammps&lt;br /&gt;
mesa&lt;br /&gt;
openmpi&lt;br /&gt;
povray&lt;br /&gt;
quantum-espresso&lt;br /&gt;
vasp&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Rules for scheduling, access control, and usage of the research infrastructure ==&lt;br /&gt;
&lt;br /&gt;
=== Online scheduling ===&lt;br /&gt;
&lt;br /&gt;
The cluster is accessible using the  UFRGS virtual prived network ([https://www1.ufrgs.br/CatalogoServicos/servicos/servico?servico=3178 vpn]) through server lovelace.if.ufrgs.br. &lt;br /&gt;
&lt;br /&gt;
To access through a unix-like system use:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
ssh &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Under windows you may configure winscp to enter the address lovelace.if.ufrgs.br.&lt;br /&gt;
&lt;br /&gt;
If you are not registered, ask for registration sending an email to fisica-ti@ufrgs.br&lt;br /&gt;
&lt;br /&gt;
=== Using softwares in the cluster ===&lt;br /&gt;
&lt;br /&gt;
To execute a software in a cluster job this program must:&lt;br /&gt;
&lt;br /&gt;
1. Be already installed&lt;br /&gt;
 &lt;br /&gt;
OR&lt;br /&gt;
&lt;br /&gt;
2. Be copied to the user home &lt;br /&gt;
&lt;br /&gt;
Ex:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
scp my_programm &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br:~/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you are compiling your program in the cluster, one option is to use &amp;lt;code&amp;gt;gcc&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Ex:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
scp -r source-code/ usuario@lovelace.if.ufrgs.br:~/&lt;br /&gt;
ssh &amp;lt;user&amp;gt;@lovelace.if.ufrgs.br:~/&lt;br /&gt;
cd source-code&lt;br /&gt;
gcc main.c funcoes.c&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
This will generate file &amp;lt;code&amp;gt;a.out&amp;lt;/code&amp;gt;, which is the executable.&lt;br /&gt;
&lt;br /&gt;
Being accessible by methods 1 or 2, the program can be executed in the cluster through one &amp;lt;strong&amp;gt;JOB&amp;lt;/strong&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
OBS: If you execute your executable without submitting as &amp;lt;strong&amp;gt;JOB&amp;lt;/strong&amp;gt;, it will be executed in the server, not in the nodes. This is not recommended since the server computational capabilities are limited and you will be slowing down the server for everyone else.&lt;br /&gt;
&lt;br /&gt;
=== Criating and executing a Job ===&lt;br /&gt;
&lt;br /&gt;
Slurm manages jobs and each job represents a program or task being executed.&lt;br /&gt;
&lt;br /&gt;
To submit a new job, you must create a script file describing the requisites and characteristics of the Job.&lt;br /&gt;
&lt;br /&gt;
A typical example of the content of a submission script is below&lt;br /&gt;
&lt;br /&gt;
Ex: &amp;lt;code&amp;gt;job.sh&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash &lt;br /&gt;
#SBATCH -n 1 # Number of cpus to be allocated (Despite the # these SBATCH lines are compiled by the slurm manager! Just after the second # they are comments. To comment the whole line put ## at the beginning.)&lt;br /&gt;
#SBATCH -N 1 # Nummber of nodes to be allocated  (You don't have to use all requisites, comment with ##)&lt;br /&gt;
#SBATCH -t 0-00:05 # Limit execution time (D-HH:MM)&lt;br /&gt;
#SBATCH -p long # Partition to be submitted&lt;br /&gt;
#SBATCH --qos qos_long # QOS &lt;br /&gt;
  &lt;br /&gt;
# Your program execution commands&lt;br /&gt;
./a.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In option --qos, use the partition name with &amp;quot;qos_&amp;quot; prefix:&lt;br /&gt;
&lt;br /&gt;
partition: short -&amp;gt; qos: qos_short -&amp;gt; limit  2 weeks&lt;br /&gt;
&lt;br /&gt;
partition: long -&amp;gt; qos: qos_long -&amp;gt; limit de 3 month&lt;br /&gt;
  &lt;br /&gt;
If you run on GPU, specify the &amp;quot;generic resource&amp;quot; gpu in cluster ada:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash &lt;br /&gt;
#SBATCH -n 1 &lt;br /&gt;
#SBATCH -N 1&lt;br /&gt;
#SBATCH -t 0-00:05 &lt;br /&gt;
#SBATCH -p long &lt;br /&gt;
#SBATCH --qos qos_long # QOS &lt;br /&gt;
#SBATCH --gres=gpu:1&lt;br /&gt;
  &lt;br /&gt;
# Comandos de execução do seu programa:&lt;br /&gt;
./a.out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To ask for a specific gpu:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#SBATCH --constraint=&amp;quot;gtx970&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To submit the job, execute:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sbatch job.sh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Usefull commands ==&lt;br /&gt;
* To list jobs:&lt;br /&gt;
  squeue&lt;br /&gt;
&lt;br /&gt;
* To list all jobs running in the cluster now:&lt;br /&gt;
  sudo squeue&lt;br /&gt;
&lt;br /&gt;
* To delete a running job:&lt;br /&gt;
  scancel [job_id]&lt;br /&gt;
&lt;br /&gt;
* To list available partitions:&lt;br /&gt;
  sinfo&lt;br /&gt;
&lt;br /&gt;
* To list the GPUs on the nodes:&lt;br /&gt;
  sinfo -o &amp;quot;%N %f&amp;quot;&lt;br /&gt;
&lt;br /&gt;
* To list the characteristics of all nodes:&lt;br /&gt;
  sinfo -Nel&lt;/div&gt;</summary>
		<author><name>Gjfeller</name></author>
	</entry>
	<entry>
		<id>https://wiki.if.ufrgs.br/index.php?title=Laborat%C3%B3rio_de_Implanta%C3%A7%C3%A3o_I%C3%B4nica&amp;diff=2096</id>
		<title>Laboratório de Implantação Iônica</title>
		<link rel="alternate" type="text/html" href="https://wiki.if.ufrgs.br/index.php?title=Laborat%C3%B3rio_de_Implanta%C3%A7%C3%A3o_I%C3%B4nica&amp;diff=2096"/>
		<updated>2023-09-22T20:10:06Z</updated>

		<summary type="html">&lt;p&gt;Gjfeller: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;big&amp;gt;'''Welcome!'''&amp;lt;/big&amp;gt; &lt;br /&gt;
&lt;br /&gt;
[[Imagem:english.png|thumb|50px|left ]][[Ion Implantation Laboratory - IF UFRGS | For English version click here ]]&lt;br /&gt;
&lt;br /&gt;
[[Imagem:Implan6c.jpg|thumb|700px|center|Ion Implantation Laboratory.]]&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
= Notices =&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''TERMS OF COMMITMENT - EXTERNAL USERS'''  [http://www.if.ufrgs.br/~grande/TermodeCompromisso.pdf] &lt;br /&gt;
&lt;br /&gt;
'''RULES FOR PUBLICATIONS: AFFILIATION and ACKNOWLEDGEMENT'''  [http://www.if.ufrgs.br/~grande/AffiliationandAcknowledgement.pdf] &lt;br /&gt;
&lt;br /&gt;
'''LABORATORY USAGE RULES'''  [http://www.if.ufrgs.br/~grande/regras.pdf] &lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
= Information for Brazilian users =&lt;br /&gt;
&lt;br /&gt;
Brazilian users who wish to sign up for beam time: click here [http://www.ufrgs.br/lii]&lt;br /&gt;
&lt;br /&gt;
'''Step-by-step tutorial''': click here   [http://www.if.ufrgs.br/~grande/TUTORIAL-SITE-IMPLANTADOR.pdf]&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
= Information for external users =&lt;br /&gt;
&lt;br /&gt;
International users who wish to request beam time should send an e-mail to Dr. Raquel Giulian (raquel.giulian@ufrgs.br) with the following information:&lt;br /&gt;
&lt;br /&gt;
- Name, address, and affiliation.&lt;br /&gt;
&lt;br /&gt;
- Which techniques will be used, and for what purposes. Expected results, or results already obtained with other techniques, may be presented here to justify the requested machine time. If anyone in the group has previous experience with the requested technique, mention it here.&lt;br /&gt;
&lt;br /&gt;
- A list of samples to be analyzed and a summary of the expected results.&lt;br /&gt;
&lt;br /&gt;
- Previous publications by the group.&lt;br /&gt;
&lt;br /&gt;
Beam time requests from external users will be evaluated by a local committee and ranked according to the proposal's quality, feasibility, and potential results.&lt;br /&gt;
&lt;br /&gt;
[[Arquivo:premium.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The Ion Implantation Laboratory is an IE-MULTI multi-user platform supported by the PREMIUM - PROPESQ Program.&lt;br /&gt;
&lt;br /&gt;
Infrastructure: &lt;br /&gt;
Ion accelerators, clean room, annealing furnaces, machine shop, and scanning electron microscope&lt;br /&gt;
&lt;br /&gt;
Office hours of the PREMIUM Program fellows:&lt;br /&gt;
&lt;br /&gt;
[[Arquivo:horarios2.jpg]]&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
IAEA ACCELERATOR PORTAL - Accelerator Knowledge Portal [http://nucleus.iaea.org/sites/accelerators/Pages/default.aspx]&lt;br /&gt;
&lt;br /&gt;
See the blog of the Instituto Nacional de Engenharia de Superfícies [http://bit.ly/MAdmsG]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
'''BIENNIAL REPORT 2019/20''' of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2020.pdf]&lt;br /&gt;
&lt;br /&gt;
'''BIENNIAL REPORT 2017/18''' of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2018.pdf]&lt;br /&gt;
&lt;br /&gt;
'''BIENNIAL REPORT 2015/16''' of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2016.pdf]&lt;br /&gt;
&lt;br /&gt;
'''BIENNIAL REPORT 2013/14''' of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2014.pdf]&lt;br /&gt;
&lt;br /&gt;
ANNUAL REPORT 2012 of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2012.pdf]&lt;br /&gt;
&lt;br /&gt;
ANNUAL REPORT 2011 of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2011.pdf]&lt;br /&gt;
&lt;br /&gt;
ANNUAL REPORT 2010 of the Laboratory  [http://www.if.ufrgs.br/~grande/arep2010.pdf]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
Photos of the Laboratory at http://implantador.multiply.com&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== SCHEDULE (subject to change) ==&lt;br /&gt;
&lt;br /&gt;
To access, click the corresponding links below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
&lt;br /&gt;
''' Microscópio de Varredura MEV ''' &lt;br /&gt;
&lt;br /&gt;
[https://docs.google.com/spreadsheet/ccc?key=0AhORlm-2AyY5dEhqbzBOTjVkVHpldjhQUmt4UzNGYkE&amp;amp;usp=sharing CRONOGRAMA]&lt;br /&gt;
&lt;br /&gt;
--&amp;gt;&lt;br /&gt;
 &lt;br /&gt;
''' 500 kV IMPLANTER '''&lt;br /&gt;
&lt;br /&gt;
[https://www.ufrgs.br/lii/index.php?page=cronograma]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
''' TANDETRON 3 MV '''&lt;br /&gt;
&lt;br /&gt;
[https://www.ufrgs.br/lii/index.php?page=cronograma]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Booking - Seminar Room ==&lt;br /&gt;
&lt;br /&gt;
The seminar room of the Ion Implantation Laboratory is used for seminars, courses, and meetings. The room's permanent schedule is shown in the table below.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''PERMANENT SCHEDULE (Year/Semester: 2023/01)'''&lt;br /&gt;
&lt;br /&gt;
{| style=&amp;quot;text-align: left; width: 100%&amp;quot; border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;2&amp;quot; cellpadding=&amp;quot;2&amp;quot;&lt;br /&gt;
| Time&lt;br /&gt;
| Monday&lt;br /&gt;
| Tuesday&lt;br /&gt;
| Wednesday&lt;br /&gt;
| Thursday&lt;br /&gt;
| Friday&lt;br /&gt;
|-&lt;br /&gt;
| 8:30 - 9:00&lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 9:00 - 9:30&lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 9:30 - 10:00   &lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 10:00 - 10:30&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|- &lt;br /&gt;
| 10:30 - 11:00&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 11:00 - 11:30&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 11:30 - 12:00&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 12:00 - 12:30&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| Group meeting - HEPSim&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 12:30 - 13:00   &lt;br /&gt;
| - &lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| Group meeting - HEPSim&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 13:00 - 13:30&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| Group meeting - HEPSim&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 13:30 - 14:00&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| -&lt;br /&gt;
| Group meeting - Johnny&lt;br /&gt;
|-&lt;br /&gt;
| 14:00 - 14:30&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| -&lt;br /&gt;
| Group meeting - Johnny&lt;br /&gt;
|-&lt;br /&gt;
| 14:30 - 15:00&lt;br /&gt;
| PowerMEIS meeting - Pedro&lt;br /&gt;
| PG reading course - Pedro&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| -&lt;br /&gt;
| Group meeting - Johnny&lt;br /&gt;
|-&lt;br /&gt;
| 15:00 - 15:30&lt;br /&gt;
| PowerMEIS meeting - Pedro&lt;br /&gt;
| PG reading course - Pedro&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| LII meeting / Seminars&lt;br /&gt;
| Group meeting - Johnny&lt;br /&gt;
|-&lt;br /&gt;
| 15:30 - 16:00   &lt;br /&gt;
| Meeting - Johnny&lt;br /&gt;
| PG reading course - Pedro&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| LII meeting / Seminars&lt;br /&gt;
| Group meeting - Johnny&lt;br /&gt;
|-&lt;br /&gt;
| 16:00 - 16:30   &lt;br /&gt;
| Meeting - Johnny&lt;br /&gt;
| -&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| LII meeting / Seminars&lt;br /&gt;
| Group meeting - Johnny&lt;br /&gt;
|-&lt;br /&gt;
| 16:30 - 17:00&lt;br /&gt;
| MEIS meeting - Pedro&lt;br /&gt;
| -&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| LII meeting / Seminars&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 17:00 - 17:30&lt;br /&gt;
| MEIS meeting - Pedro&lt;br /&gt;
| -&lt;br /&gt;
| PGMICRO MIC01 - Henri&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 17:30 - 18:30&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|-&lt;br /&gt;
| 18:30 - 19:00&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
| -&lt;br /&gt;
|- &lt;br /&gt;
&lt;br /&gt;
|} &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The room can be booked for occasional events in the free slots. The occasional bookings already made are the following:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''OCCASIONAL SCHEDULE (Year/Semester: 2023/01)'''&lt;br /&gt;
&lt;br /&gt;
{| style=&amp;quot;text-align: left; width: 100%&amp;quot; border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;2&amp;quot; cellpadding=&amp;quot;2&amp;quot;&lt;br /&gt;
| Day&lt;br /&gt;
| Time&lt;br /&gt;
| Responsible&lt;br /&gt;
| Notes&lt;br /&gt;
|-&lt;br /&gt;
|-&lt;br /&gt;
|31/01/2023 (Tuesday)&lt;br /&gt;
|15:30 - 18:00&lt;br /&gt;
|Cláudio Radtke&lt;br /&gt;
|Class&lt;br /&gt;
|-&lt;br /&gt;
|DD/MM/AAAA (SSSSS)&lt;br /&gt;
|HH:MM - HH:MM&lt;br /&gt;
|NNNN&lt;br /&gt;
|OOOO&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|} &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
To make a booking, contact Prof. Raul (raul@if.ufrgs.br, extension 6551).&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
= Photos =&lt;br /&gt;
[[Imagem:festajunina.jpg|thumb|500px|more photos at http://implantador.multiply.com]]&lt;br /&gt;
&lt;br /&gt;
 http://implantador.multiply.com&lt;br /&gt;
&lt;br /&gt;
= History =&lt;br /&gt;
The Ion Implantation Laboratory of the Instituto de Física da UFRGS was created in 1980 on the initiative of Prof. Fernando C. Zawislak and colleagues who until then had worked in perturbed angular correlation and nuclear physics. The Laboratory's activities began with the acquisition of a 400 kV accelerator in 1981 (upgraded to 500 kV in 1996), funded by FINEP. In 1989 we received a 250 kV ion implanter as a donation from IBM (USA), which is dedicated to microelectronics applications. Before the end of the 1990s, the Ion Implantation Laboratory was already one of UFRGS's most successful research facilities, not only for its qualified scientific output, but also for the training of doctors and masters and for its intense international exchange.&lt;br /&gt;
[[Imagem:placa1.jpg|thumb|170px|left|Implanter]]&lt;br /&gt;
[[Imagem:placa2.jpg|thumb|170px|right|Tandetron accelerator]]&lt;br /&gt;
&lt;br /&gt;
As a result of this success, the Ion Implantation group received further funding from FINEP, which allowed the acquisition, in January 1995, of a 3 MV TANDEM accelerator, installed in a new building and in operation since December 1996.&lt;br /&gt;
&lt;br /&gt;
With these three accelerators, the Laboratory can produce beams of practically every stable isotope in the periodic table, enabling work that covers, beyond physics research, many areas of materials science. Moreover, the two higher-energy machines, and especially the 3 MV Tandem accelerator, are efficient instruments for the analysis of materials, surfaces, interfaces, and thin films, through the techniques of Rutherford Backscattering Spectrometry (RBS), Channeling, Nuclear Reaction Analysis (NRA), Elastic Recoil Detection Analysis (ERDA), Particle-Induced X-ray Emission (PIXE), and Scattering, all available at the Ion Implantation Laboratory of the Instituto de Física of the Universidade Federal do Rio Grande do Sul.&lt;br /&gt;
&lt;br /&gt;
= Research Lines =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
*''1. Ion-matter interaction:''&lt;br /&gt;
&lt;br /&gt;
- study of plasmon excitation with molecular beams;&lt;br /&gt;
&lt;br /&gt;
- studies of the Coulomb explosion of molecules and absolute depth profiling;&lt;br /&gt;
&lt;br /&gt;
- determination of the stopping powers of various ions in semi-light targets;&lt;br /&gt;
&lt;br /&gt;
- study of the energy straggling of different types of ions in HfO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;, ZrO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;, and Al&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;O&amp;lt;sub&amp;gt;3&amp;lt;/sub&amp;gt; films.&lt;br /&gt;
 &lt;br /&gt;
*''2. Semiconductor device physics:''&lt;br /&gt;
 &lt;br /&gt;
- technological processes;&lt;br /&gt;
&lt;br /&gt;
- defects in semiconductors created during processing;&lt;br /&gt;
&lt;br /&gt;
- synthesis of new electronic materials;&lt;br /&gt;
&lt;br /&gt;
- electrical measurements in micro- and nanostructures.&lt;br /&gt;
 &lt;br /&gt;
*''3. Photoluminescence from Si and Ge nanocrystals formed by ion implantation and high-temperature annealing in SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; and Si&amp;lt;sub&amp;gt;3&amp;lt;/sub&amp;gt;N&amp;lt;sub&amp;gt;4&amp;lt;/sub&amp;gt; matrices.''&lt;br /&gt;
 &lt;br /&gt;
*''4. Study of the formation processes and structure of nanocrystals of implanted ions (Sn, Pb, Ge) in SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;/Si and SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;/SiN/Si.'' &lt;br /&gt;
SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; films grown on Si and on SiN/Si are implanted with Sn, Ge, and Pb ions and subjected to thermal treatment to form nanostructures inside the SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; and at the interfaces. Microstructural characterization is performed with RBS, TEM, and HRTEM. Using thermal treatments at different temperatures and in different atmospheres, it is possible to “fabricate” nanoislands of various sizes and locations. Interesting recent results show that it is possible to obtain Si nanocrystals exclusively at the SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;/Si and SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;/SiN interfaces. These results have potential application in modern information technology (optical filters, flash memories, etc.).&lt;br /&gt;
 &lt;br /&gt;
*''5. Investigation of nanoscale dielectric films for use as gate dielectrics in advanced semiconductor devices.'' &lt;br /&gt;
&lt;br /&gt;
- Thermal processing. Study of atomic transport. Passivation with hydrogen and deuterium. MOS structures. Depth-profiling methods with sub-nanometer resolution using medium-energy ion scattering and narrow nuclear reactions at low energies. Hydrogen quantification and profiling by nuclear reaction and step-by-step chemical dissolution.&lt;br /&gt;
 &lt;br /&gt;
- Dielectric films thermally grown on silicon carbide. &lt;br /&gt;
Investigate the growth mechanisms through the mobility of the O, N, H, C, and Si species in dielectric films during the thermal treatment of c-SiC in oxidizing, nitriding, and/or passivating atmospheres, using isotopic tracing and ion-beam analysis with sub-nanometer resolution (NRA, NRP, RBS, LEIS, SIMS). Characterize the interface formed between the dielectric film and the SiC substrate by XPS and AFM. Model these processes.&lt;br /&gt;
 &lt;br /&gt;
- Alternative dielectric films deposited on silicon carbide.&lt;br /&gt;
Investigate the thermal stability through the mobility of the species metal, O, H, C, and Si in dielectric films alternative to SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; deposited on single-crystal silicon carbide wafers during thermal treatment in oxidizing (O&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;), inert (Ar, vacuum), and/or passivating (NO, H&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;) atmospheres, using isotopic tracing and high-resolution ion-beam analysis (mainly nuclear reactions). Characterize the interface formed by XPS, SIMS, and AFM.&lt;br /&gt;
 &lt;br /&gt;
*''6. Physico-chemical characterization of semiconductor nanostructures.'' The main goal of this project is a detailed physico-chemical description of semiconductor nanocrystals. This description addresses aspects related to surface modification by passivation and functionalization treatments, as well as exposure to oxidizing environments.&lt;br /&gt;
 &lt;br /&gt;
*''7. Determination of the elemental composition of foods and study of ion transport mechanisms in proteins and cells using the PIXE (Particle-Induced X-ray Emission) technique.'' &lt;br /&gt;
&lt;br /&gt;
*''8. Effects of ion irradiation on the structural properties of hydrogenated amorphous carbon films''&lt;br /&gt;
Ions of different species and energies are used to study irradiation effects in a-C:H films, as well as in films with N and F incorporated during the deposition processes. The studies are carried out with RBS, nuclear reaction analysis, Raman, AFM, nanoindentation, and FTIR. The results show that irradiation modifies the structure, hardness, and Young's modulus of the films. Perhaps the most important recent result is the effect of irradiation on the internal stress, which vanishes at higher ion doses (N&amp;lt;sup&amp;gt;+&amp;lt;/sup&amp;gt; and Xe&amp;lt;sup&amp;gt;+&amp;lt;/sup&amp;gt;). Given this significant reduction of the internal stress, we are currently investigating the adhesion and delamination properties of the irradiated films.&lt;br /&gt;
 &lt;br /&gt;
*''9. Analysis of the chemical composition and depth distribution of the various elements in metallic-compound nanolaminates and nanoscale multilayers.'' &lt;br /&gt;
Depth-profiling methods with sub-nanometer resolution using medium-energy ion scattering and narrow nuclear reactions at low energies. Hydrogen quantification and profiling by nuclear reaction and step-by-step chemical dissolution.&lt;br /&gt;
 &lt;br /&gt;
*''10. Plasma-immersion ion implantation of metallic, polymeric, and ceramic systems.'' &lt;br /&gt;
Mechanical, electrochemical, and tribological characterization of the implanted systems.&lt;br /&gt;
 &lt;br /&gt;
*''11. Nanostructured protective coatings.'' &lt;br /&gt;
Preparation and physical and mechanical characterization of nanostructured multilayers of refractory oxides (Al&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;O&amp;lt;sub&amp;gt;3&amp;lt;/sub&amp;gt;, TiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;, ZrO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;) for coating cutting tools intended for dry machining. Investigation of the relations between composition, structure, and mechanical properties.&lt;br /&gt;
 &lt;br /&gt;
*''12. Applications of ion implantation in optimizing the thermal stability of the microstructure of Al and Cu contacts and interconnects in microelectronic devices.'' &lt;br /&gt;
Work in this area focuses on systematic research into the microstructural evolution of Al and Cu thin films simulating metallic interconnects in microelectronic devices. The effects of temperature and of operation at high current densities (MA.cm&amp;lt;sup&amp;gt;-2&amp;lt;/sup&amp;gt;) combine to increase the vacancy creation rate, directing the flux of interstitial atoms through collisions between conduction electrons and lattice atoms of the interconnect. This creates vacancy-rich regions, which grow until they cause the rupture of the interconnect (electromigration failure). The effects of modifying the nanostructure of the interconnects by ion implantation and/or thermal treatment will now be studied by measuring the electrical conductivity of these interconnects (mean-lifetime tests).&lt;br /&gt;
 &lt;br /&gt;
*''13. Semiconductor nanowires (synthesis, modification, characterization, and applications):'' &lt;br /&gt;
&lt;br /&gt;
- synthesis of vertically aligned ZnO nanowires by the vapor-liquid-solid (VLS) method on Al&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;O&amp;lt;sub&amp;gt;3&amp;lt;/sub&amp;gt;, Si, SiO&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;, ITO/Si, and GaN/Si substrates;&lt;br /&gt;
&lt;br /&gt;
- doping of ZnO nanowires by ion implantation;&lt;br /&gt;
&lt;br /&gt;
- characterization of ZnO nanowires by SEM, HRTEM, AFM, PL, and XRD;&lt;br /&gt;
&lt;br /&gt;
- electrical measurements on individual nanowires and groups of wires, doped and undoped;&lt;br /&gt;
&lt;br /&gt;
- field emission in ZnO nanowires grown in different patterns;&lt;br /&gt;
- light and gas sensing based on ZnO nanowires.&lt;br /&gt;
 &lt;br /&gt;
*''14. Formation of metallic patterns for the selective growth of semiconductor nanowires using lithography techniques:'' &lt;br /&gt;
&lt;br /&gt;
- optical lithography;&lt;br /&gt;
&lt;br /&gt;
- AFM nanolithography;&lt;br /&gt;
&lt;br /&gt;
- single-ion nanolithography;&lt;br /&gt;
&lt;br /&gt;
- ion-microbeam lithography.&lt;br /&gt;
&lt;br /&gt;
= Organizational Chart =&lt;br /&gt;
[[Imagem:Organograma2015.png|thumb|700px|Organizational chart of the Ion Implantation Laboratory - Meeting of 17/03/2015 ]]&lt;br /&gt;
&lt;br /&gt;
'''General Coordinator''' - Prof. Pedro Luis Grande&lt;br /&gt;
&lt;br /&gt;
'''Deputy General Coordinator''' - Prof. Paulo F. P. Fichtner&lt;br /&gt;
&lt;br /&gt;
'''Accelerator Coordinator''' - Prof. Johnny F. Dias&lt;br /&gt;
&lt;br /&gt;
'''Deputy Accelerator Coordinator''' - Prof. Livio Amaral&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Management Committee'''&lt;br /&gt;
&lt;br /&gt;
Pedro Luis Grande,&lt;br /&gt;
Johnny F. Dias,&lt;br /&gt;
Paulo F. P. Fichtner,&lt;br /&gt;
Livio Amaral,&lt;br /&gt;
Henri Boudinov,&lt;br /&gt;
Jonder Morais,&lt;br /&gt;
Fernanda Stedile,&lt;br /&gt;
Cláudio Radtke&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Management Sub-committee - accelerators'''&lt;br /&gt;
&lt;br /&gt;
Johnny F. Dias,&lt;br /&gt;
Agostinho Bulla,&lt;br /&gt;
Livio Amaral&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Users Committee and RAU'''&lt;br /&gt;
&lt;br /&gt;
José Henrique R. dos Santos,&lt;br /&gt;
Raquel Giulian,&lt;br /&gt;
Leandro Araújo&lt;br /&gt;
&lt;br /&gt;
= Staff =&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font face=&amp;quot;sans-serif&amp;quot;&amp;gt;'''RESEARCHERS'''&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/6168905373490043 Fernando Claudio Zawislak], Dr. (IF, UFRGS, 1967) - Founder and General Coordinator of the Group from 1980 to 2008 [mailto:zawislak@if.ufrgs.br] &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1888879974374014 Moni Behar], Dr. (UBA, ARGENTINA, 1970) - Accelerator Coordinator from 1982 to 2010 [mailto:behar@if.ufrgs.br] &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/7503942249157750 Israel Jacob Rabin Baumvol], Dr. (IF, UFRGS, 1977) [mailto:israel@if.ufrgs.br] &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1600461423386842 Livio Amaral], Dr. (IF, UFRGS, 1982)  [mailto:amaral@if.ufrgs.br ]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/3107128880946249 Paulo Fernando Papaleo Fichtner], Dr. (IF, UFRGS, 1987) [mailto:paulo.fichtner@ufrgs.br] &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/6996978543675989 Pedro Luís Grande], Dr. (IF, UFRGS, 1989) - General Coordinator of the Group since 2009 [http://www.researcherid.com/rid/F-4065-2010 Researcher ID][mailto:grande@if.ufrgs.br] &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1038716347038971 Johnny Ferraz Dias], Dr. (UG, BELGIUM, 1994) - Accelerator Coordinator since 2010 [mailto:jfdias@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/7456235597127456 Henri Ivanov Boudinov], Dr. (IE-BAN, BULGARIA, 1991) [mailto:henry@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/8312079399403127 Fernanda Chiarello Stedile], Dr. (IQ, UFRGS, 1994) [mailto:fernanda.stedile@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/9282310402722246  Rafael Peretti Pezzi], Dr. (IF, UFRGS, 2009) [mailto:pezzi@if.ufrgs.br]&amp;lt;/font&amp;gt;  [[Caderno de Laboratório - Rafael Pezzi|Lab Notebook]]&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/8533340072601019 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Raul Carlos Fadanelli Filho&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2005) [mailto:raul@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1933730859000512 Ricardo Meurer Papaléo], Dr. (U.UPPSALA, SWEDEN, 1996) -  PUC-RS [mailto:papaleo@pucrs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1019763100629841 Rogério Luis Maltez], Dr.(IF, UFRGS, 1997) [mailto:maltez@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1465526203054126 Jonder Morais], Dr.(IF, UFRGS, 1999) [mailto:jonder@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/4839018758765203 Claudio Radtke], Dr. (IF, UFRGS, 2003) [mailto:claudiog@iq.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/8736619776465513 Cristiano Krug], Dr. (IF, UFRGS, 2003) [mailto:cristiano.krug@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/7255799414696127 Daniel Lorscheitter Baptista], Dr. (IF, UFRGS, 2003) [mailto:dbaptista@gmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/2584028342156200  Gabriel Viera Soares], Dr. (IF, UFRGS, 2008) [mailto:gabriel.soares@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/1640810725203285  Leandro Langie Araujo], Dr. (IF, UFRGS, 2004) [mailto:leandro.langie@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/6103809034908461  Raquel Giulian], Dr. (RSPE, ANU, AUSTRALIA, 2009) [mailto:raquel.giulian@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/9505103753673163  José Henrique Rodrigues dos Santos], Dr. (IF, UFRGS 1997) [mailto:zeheros@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/5087940395812427   Agenor Hentz da Silva], Dr. (IF,UFRGS, 2008) [mailto:agenor@gmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;[http://lattes.cnpq.br/5186282788317424  Silma Alberton Corrêa], Dr. (IQ, UFRGS, 2013) [mailto:silma.alberton@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font face=&amp;quot;sans-serif&amp;quot;&amp;gt;'''ASSOCIATE RESEARCHERS'''&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/1357421038233208 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Douglas Langie da Silva&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2004) - Collaborator, UFPel [mailto:douglaslangie@gmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/7323260281207063 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Eduardo Ceretta Moreira&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2000) - Collaborator, UNIPAMPA [mailto:eduardomoreira@unipampa.edu.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/2296402907146130 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Irene Teresinha Santos Garcia&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2001) - Collaborator, UFPEL [mailto:irenetsgarcia@yahoo.com.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/1961170773106816 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Uilson Schwantz Sias&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2006) - Collaborator, CEFET-RS [mailto:uilson@cefetrs.tche.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/2375490528098026 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Felipe Kremer&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2010) - Collaborator, ANU, Australia  [mailto:kremer.felipe@gmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/5747875882259575 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Carla Eliete Iochims dos Santos&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (IF, UFRGS, 2011) - Collaborator, FURG [mailto:carlaiochims@yahoo.com.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[http://lattes.cnpq.br/4980343694090946 &amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Igor Alencar Vellame&amp;lt;/font&amp;gt;]&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Dr. (Technische Universität Bergakademie Freiberg, Germany, 2010) - Collaborator, UFSC [mailto:igor.alencar@ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font face=&amp;quot;sans-serif&amp;quot;&amp;gt;'''POSTDOCTORAL RESEARCHERS'''&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Raquel Silva Thomaz&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font face=&amp;quot;sans-serif&amp;quot;&amp;gt;'''TECHNICAL SUPERVISOR OF THE ACCELERATORS'''&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;span class=&amp;quot;MsoHyperlink&amp;quot;&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Agostinho A. Bulla&amp;lt;/font&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, Electrical Engineer [mailto:bulla@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font face=&amp;quot;sans-serif&amp;quot;&amp;gt;'''TECHNICIANS'''&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;span class=&amp;quot;MsoHyperlink&amp;quot;&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Edison Valério Nunes Junior&amp;lt;/font&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, accelerator operator technician [mailto:miro@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;span class=&amp;quot;MsoHyperlink&amp;quot;&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Leandro Tedesco Rossetto&amp;lt;/font&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, accelerator operator technician [mailto:pborba@if.ufrgs.br]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;span class=&amp;quot;MsoHyperlink&amp;quot;&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Paulo Kovalick&amp;lt;/font&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;, mechanical technician, in charge of the machine shop [mailto:paulokovalick@hotmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font face=&amp;quot;sans-serif&amp;quot;&amp;gt;'''PREMIUM FELLOWS'''&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Eduardo Ribeiro dos Santos [mailto:dududarth@gmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;font size=&amp;quot;-1&amp;quot; face=&amp;quot;sans-serif&amp;quot;&amp;gt;Marcelo Cavagnolli [mailto:marceloc.eng@gmail.com]&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Infrastructure =&lt;br /&gt;
{| style=&amp;quot;text-align: center; width: 100%&amp;quot; border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;2&amp;quot; cellpadding=&amp;quot;2&amp;quot;&lt;br /&gt;
| [[Image:Tandetron.jpg|center|thumb| [[Tandetron]] 3 MV accelerator]]&lt;br /&gt;
| [[Image:HV500c.jpg|center|thumb| [[500]] kV accelerator]]&lt;br /&gt;
| [[Image:250.jpg|center|thumb| 250 kV accelerator]]&lt;br /&gt;
|-&lt;br /&gt;
| [[Image:fornos.jpg|center|thumb| [[Reatores|Reactors]] and furnaces]]&lt;br /&gt;
| [[Image:optica.jpg|center|thumb| [[Óptica|Optics]]]]&lt;br /&gt;
| [[Image:preparo.jpg|center|thumb| Clean room and sample preparation]]&lt;br /&gt;
|-&lt;br /&gt;
| [[Image:oficina.jpg|center|thumb| [[Oficina|Machine shop]]]]&lt;br /&gt;
| [[Image:apoio.jpg|center|thumb| Support rooms]]&lt;br /&gt;
| [[Image:250.jpg|center|thumb| 250 kV accelerator]]&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Techniques =&lt;br /&gt;
&lt;br /&gt;
'''ION BEAM ANALYSIS TECHNIQUES'''&lt;br /&gt;
&lt;br /&gt;
*Rutherford Backscattering Spectrometry ([[RBS]]) and Channeling&lt;br /&gt;
&lt;br /&gt;
*Nuclear Reaction Analysis ([[NRA]])&lt;br /&gt;
&lt;br /&gt;
*Elastic Recoil Detection Analysis ([[ERDA]])&lt;br /&gt;
&lt;br /&gt;
*Particle-Induced X-ray Emission ([[PIXE]])&lt;br /&gt;
&lt;br /&gt;
*Medium-Energy Ion Scattering ([[MEIS]])&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
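As a quick reference for the backscattering techniques above (a standard textbook relation, included here for illustration and not specific to this laboratory): an ion of mass M&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt; and energy E&amp;lt;sub&amp;gt;0&amp;lt;/sub&amp;gt;, elastically backscattered at angle θ by a target atom of mass M&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;, emerges with energy E&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt; = K·E&amp;lt;sub&amp;gt;0&amp;lt;/sub&amp;gt;, where the kinematic factor K is&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;K = \left( \frac{M_1 \cos\theta + \sqrt{M_2^2 - M_1^2 \sin^2\theta}}{M_1 + M_2} \right)^2&amp;lt;/math&amp;gt;&lt;br /&gt;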
&lt;br /&gt;
'''ION IMPLANTATION AND IRRADIATION'''&lt;br /&gt;
&lt;br /&gt;
=Associated Laboratories=&lt;br /&gt;
&lt;br /&gt;
*[http://www.if.ufrgs.br/microel Laboratório de Microeletrônica ]&lt;br /&gt;
*[https://plone.ufrgs.br/fqsis Laboratório de Físico-Química de Superfícies e Interfaces Sólidas ]&lt;br /&gt;
*[http://www.if.ufrgs.br/~jonder/LEe.html Laboratório de Espectroscopia de Elétrons]&lt;br /&gt;
&lt;br /&gt;
=Links=&lt;br /&gt;
* [[Logotipo|Logo]]&lt;br /&gt;
* [[Aceleradores e feixes de íons|Accelerators and ion beams]]&lt;br /&gt;
* [[Semicondutores|Semiconductors]]&lt;br /&gt;
* [[Nano]]&lt;br /&gt;
* [[Energia|Energy]]&lt;br /&gt;
* [[Manual MEIS]]&lt;br /&gt;
* [[Manual PowerMeis]]&lt;br /&gt;
&lt;br /&gt;
= Contact =&lt;br /&gt;
&lt;br /&gt;
Laboratório de Implantação Iônica (Ion Implantation Laboratory) - Instituto de Física - Universidade Federal do Rio Grande do Sul (UFRGS)&lt;br /&gt;
&lt;br /&gt;
Av. Bento Gonçalves, 9500&lt;br /&gt;
91501-970 Porto Alegre, RS&lt;br /&gt;
Brazil&lt;br /&gt;
Phone: +55 51 3308-7004&lt;br /&gt;
Fax: +55 51 3308-7286&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Gjfeller</name></author>
	</entry>
</feed>