costapan.webpages.auth.gr
URL:
https://costapan.webpages.auth.gr/
Submission: On April 28 via automatic, source certstream-suspicious — Scanned from DE
CONSTANTINOS PANAGIOTAKOPOULOS

I am a Professor at the School of Technology of the Aristotle University of Thessaloniki. Papers concerning my research in Theoretical High Energy Physics and Cosmology can be found here. My research in Machine Learning is listed below.

Margin Perceptron with Unlearning
Ref: Panagiotakopoulos, C., Tsampouka, P.: The Margin Perceptron with Unlearning. ICML (2010) 855-862
- Solving for hard margin (L2-soft margin): download source code, v1.1, Feb 2013 (exe)
- Solving for L1-soft margin: download source code, v1.1, Feb 2013; single precision source code, v1.1, July 2013 (exe, exe)

Perceptron with Dynamic Margin
Ref: Panagiotakopoulos, C., Tsampouka, P.: The Perceptron with Dynamic Margin. ALT (2011) 204-218
- Download source code, v1.1, Feb 2013 (exe)

Margitron
Ref: Panagiotakopoulos, C., Tsampouka, P.: The Margitron: A Generalized Perceptron with Margin. IEEE Transactions on Neural Networks 22(3) (2011) 395-407
- Download source code, v1.1, Feb 2013 (exe)

Perceptron with Weight Shrinking
Ref: Panagiotakopoulos, C., Tsampouka, P.: The Role of Weight Shrinking in Large Margin Perceptron Learning. arXiv:1205.4698 (2012)
- Perceptron with Constant Shrinking: download source code, v1.0, Feb 2013 (exe)
- Perceptron with Variable Shrinking (with n=3): download source code, v1.0, Feb 2013 (exe)

Stochastic Gradient Descent
Ref: Panagiotakopoulos, C., Tsampouka, P.: The Stochastic Gradient Descent for the Primal L1-SVM Optimization Revisited.
arXiv:1304.6383 (2013) (accepted at ECML/PKDD 2013)
- SGD-r, stochastic gradient descent with random selection of examples: download source code, v1.0, Apr 2013; single precision source code, v1.0, July 2013 (exe, exe)
- SGD, stochastic gradient descent with single (l=1) or mixed single-multiple (l>1) updates and relative accuracy epsilon: download source code, v1.0, Apr 2013; single precision source code, v1.0, July 2013 (exe, exe)

Remark
In the above programs the seed of the random number generator was fixed to 0, which was the default value of previous Cygwin releases.

Instructions
The programs compile with the g++ compiler. To make the .exe under Cygwin, type:

$ g++ -Wall -O3 file.cc -o train -lm

To extend the maximum amount of allocatable memory, set the desired heap size in the .exe file. For example, for a size of 1024 MB the command is:

$ peflags --cygwin-heap=1024 train.exe

For the .exe files given, the heap size was set to 1024 MB. To run the .exe files on a Windows platform one needs cygwin1.dll, which comes with the Cygwin setup.

To see the available inputs for each program, type:

$ ./train

To run a program, type:

$ ./train [inputs] datafile modelfile

The datafile should be given in SVM-Light format. Each example takes up one line: the label, from the set {-1,+1}, comes first; then only the attributes with non-zero value should be provided, each separated from its value by the character ':'. A typical line reads:

-1 1:2.11 3:4.01 7:9.0 15:2.5

If the user does not provide a name for a modelfile, one will be created in the form datafile.model. The modelfile contains the components of the produced augmented weight vector. In particular, for the perceptron with unlearning solving for the L1-soft margin, the weight vector a is divided by b, since w = a/b. The first component always corresponds to the extra feature rho of the augmented patterns.
Contact
For any question regarding the papers or the programs, feel free to contact either of the authors.
emails: costapan@eng.auth.gr, petroula@auth.gr