Date: Mon, 1 Mar 2010 12:00:25 -0800 (PST)
From: Nistor Grozavu <nistor_grozavu@yahoo.com>
Subject: CFP: ICNNAI-2010 Special
Session: Incremental Topological Learning Models and Dimensional Reduction
To: akira-i@brest-state-tech-univ.org
Reply-To: nistor_grozavu@yahoo.com

CFP: ICNNAI-2010 Special Session:
Incremental Topological Learning Models and Dimensional Reduction

==============================
Submissions Due: April 4, 2010
==============================

1 - 4 June, 2010
Brest State Technical University
Belarus
http://icnnai.bstu.by/icnnai-2010.html

SCOPE
======
Incremental learning is a subfield of artificial intelligence that deals with data streams. The key hypothesis is that the algorithms can learn from a subset of the data and then re-learn as new unlabeled data arrive. Once learning is complete, the remaining problems include cluster analysis and visualization of the results. Topological learning is one of the best-known techniques that allows clustering and visualization simultaneously. At the end of the topographic learning, "similar" data are collected into clusters, which correspond to sets of similar observations. These clusters can be represented by information more concise than the raw listing of their patterns, such as their center of gravity or various statistical moments.
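As a rough illustration of the topological learning sketched above, the following minimal example trains a tiny one-dimensional self-organizing map and then summarizes each resulting cluster by its center of gravity rather than the raw list of its points. The map size, learning-rate schedule, and toy data are assumptions chosen for brevity, not part of this call.

```python
import numpy as np

# Toy data: two well-separated 2-D blobs (an assumption for illustration).
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(3.0, 0.1, (50, 2))])

# A tiny 1-D self-organizing map: 4 units arranged on a line.
n_units = 4
weights = rng.uniform(0.0, 3.0, (n_units, 2))

n_epochs = 20
for epoch in range(n_epochs):
    lr = 0.5 * (1.0 - epoch / n_epochs)                    # decaying learning rate
    radius = max(1.0 * (1.0 - epoch / n_epochs), 0.01)     # decaying neighborhood radius
    for x in rng.permutation(data):
        # Best-matching unit in data space.
        bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
        # Gaussian neighborhood on the 1-D map topology.
        grid_dist = np.abs(np.arange(n_units) - bmu)
        h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
        weights += lr * h[:, None] * (x - weights)

# Assign each observation to its best-matching unit, then represent each
# cluster concisely by its center of gravity.
labels = np.array([int(np.argmin(np.linalg.norm(weights - x, axis=1)))
                   for x in data])
for u in np.unique(labels):
    centroid = data[labels == u].mean(axis=0)
    print(f"unit {u}: center of gravity {np.round(centroid, 2)}")
```

The map's line topology is what makes the clustering "topological": neighboring units on the map are pulled toward neighboring regions of data space, so the trained map supports visualization and clustering at once.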
As expected, this information is easier to manipulate than the original data points.

Dimensionality reduction is another major challenge in unsupervised learning: transforming a high-dimensional dataset into a low-dimensional space while retaining most of the useful structure of the original data, keeping only the relevant features and observations. Dimensionality reduction can be achieved with a clustering technique, to reduce the number of observations, or with a feature-selection approach, to reduce the feature space.

This session solicits theoretical and applied research papers including, but not limited to, the following topics:

  · Supervised/unsupervised topological learning;
  · Self-organization (based on artificial neural networks, but not limited to them);
  · Clustering visualization and analysis;
  · Time during the learning process;
  · Memory-based systems;
  · User-interaction models;
  · Fusion (consensus) based models;
  · Clustering;
  · Feature selection.

SUBMISSION
================
The special session will be held as part of the ICNNAI'2010 conference (the 5th International Conference on Neural Networks and Artificial Intelligence). Authors should submit papers through the EasyChair site: http://www.easychair.org/conferences/?conf=itlmdr10.

All paper submissions will be handled electronically. Detailed instructions for submitting papers are provided on the conference home page at:
http://icnnai.bstu.by/icnnai-2010.html

Papers must conform to the requirements detailed in the instructions to authors on the ICNNAI 2010 web site. Accepted papers must be presented by one of the authors to be published in the conference proceedings.
If you have any questions, do not hesitate to direct them to nistor.grozavu@lipn.univ-paris13.fr

IMPORTANT DATES
================
Paper submission deadline: 4 April
Notification of acceptance: 22 April
Camera-ready papers: 29 April

ORGANIZERS
================
Nistor GROZAVU, Post-Doc, Computer Science Laboratory of Paris 13 University, FRANCE
Mustapha LEBBAH, Associate Professor, Paris 13 University, FRANCE
Younès BENNANI, Full Professor, Paris 13 University, FRANCE

Best regards,
Nistor Grozavu
PhD, Computer Science Laboratory of the Paris 13 University (LIPN)
http://www-lipn.univ-paris13.fr/~grozavu/
tel: +33 (0)626901790