
Commit f189647

Commit message: updating docs.
Parent: 1ca9fc5

File tree

2 files changed: +25 −24 lines


lbjava/doc/20NEWSGROUP.md

Lines changed: 24 additions & 24 deletions

@@ -374,7 +374,7 @@ Referring once again to this
 we first examine our
 chosen directory structure starting from the root directory of the distribution.
 
-```java
+```
 $ ls
 20news.LBJava class LBJava test.sh
 README data src train.sh
@@ -556,28 +556,28 @@ Alternatively, we can call `TestDiscrete` from within our Java application. This
 if our parser’s constructor isn’t so simple, or when we’d like to do further processing with the
 performance numbers themselves. The simplest way to do so is to pass instances of our classifier,
 labeler, and parser to `TestDiscrete`, like this:
+
+```java
+NewsgroupLabel oracle = new NewsgroupLabel();
+Parser parser = new NewsgroupParser("data/20news.test");
+TestDiscrete tester = TestDiscrete.testDiscrete(classifier, oracle, parser);
+tester.printPerformance(System.out);
+```
 
-```java
-NewsgroupLabel oracle = new NewsgroupLabel();
-Parser parser = new NewsgroupParser("data/20news.test");
-TestDiscrete tester = TestDiscrete.testDiscrete(classifier, oracle, parser);
-tester.printPerformance(System.out);
-```
-
-This Java code does exactly the same thing as the command line above. We can also
-exert more fine grained control over the computed statistics. Starting from a new instance of
-`TestDiscrete`, we can call `reportPrediction(String,String)` every time we acquire both a
-prediction value and a label. Then we can either call the `printPerformance(PrintStream)`
-method to produce the standard output in table form or any of the methods whose names start
-with `get` to retrieve individual statistics. The example code below retrieves the overall precision,
-recall, F1, and accuracy measures in an array.
+This Java code does exactly the same thing as the command line above. We can also
+exert more fine grained control over the computed statistics. Starting from a new instance of
+`TestDiscrete`, we can call `reportPrediction(String,String)` every time we acquire both a
+prediction value and a label. Then we can either call the `printPerformance(PrintStream)`
+method to produce the standard output in table form or any of the methods whose names start
+with `get` to retrieve individual statistics. The example code below retrieves the overall precision,
+recall, F1, and accuracy measures in an array.
 
-```java
-TestDiscrete tester = new TestDiscrete();
-...
-tester.reportPrediction(classifier.discreteValue(ngPost),
-                        oracle.discreteValue(ngPost));
-...
-double[] performance = tester.getOverallStats();
-System.out.println("Overall Accuracy: " + performance[3]);
-```
+```java
+TestDiscrete tester = new TestDiscrete();
+...
+tester.reportPrediction(classifier.discreteValue(ngPost),
+                        oracle.discreteValue(ngPost));
+...
+double[] performance = tester.getOverallStats();
+System.out.println("Overall Accuracy: " + performance[3]);
+```
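The snippet in the diff above depends on LBJava's `TestDiscrete` class, so it cannot run on its own. As a rough illustration of what `reportPrediction(String,String)` accumulates and where the overall accuracy returned by `getOverallStats()` comes from, here is a minimal self-contained stand-in; the `DiscreteTester` class and its `accuracy()` method are invented for this sketch and are not LBJava API:

```java
// Hypothetical stand-in for LBJava's TestDiscrete (names invented for this
// sketch). It records (prediction, gold label) pairs and derives the overall
// accuracy: the fraction of predictions that match their labels.
public class DiscreteTester {
    private int correct = 0;
    private int total = 0;

    // Record one prediction/label pair, as TestDiscrete's reportPrediction does.
    public void reportPrediction(String prediction, String label) {
        total++;
        if (prediction.equals(label)) correct++;
    }

    // Overall accuracy over all reported pairs; 0.0 if nothing was reported.
    public double accuracy() {
        return total == 0 ? 0.0 : (double) correct / total;
    }

    public static void main(String[] args) {
        DiscreteTester tester = new DiscreteTester();
        // Labels chosen to resemble 20 Newsgroups categories.
        tester.reportPrediction("comp.graphics", "comp.graphics"); // correct
        tester.reportPrediction("sci.med", "sci.med");             // correct
        tester.reportPrediction("sci.med", "comp.graphics");       // wrong
        tester.reportPrediction("rec.autos", "rec.autos");         // correct
        System.out.println("Overall Accuracy: " + tester.accuracy());
    }
}
```

Precision, recall, and F1 per label follow the same pattern with per-label counts; LBJava's real class computes all of them in one pass.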

lbjava/doc/REGRESSION.md

Lines changed: 1 addition & 0 deletions

@@ -103,6 +103,7 @@ real MyLabel(MyData d) <- {
     return d.getLabel();
 }
 ```
+
 #### 6.2.3 Classifier
 
 Since we are using a classifier with real output type, we need to choose a training method compatible with this output type. In this example we use Stochastic Gradient Descent. (See [Training Algorithms](ALGORITHMS.md) for a complete list of training algorithms with their expected output types.)
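The REGRESSION.md context above names Stochastic Gradient Descent as the training method for a real output type. As a hedged sketch of the idea only (not LBJava's actual learner; the `SgdSketch` class and `fit` method are invented here), this is what SGD looks like for a one-feature linear model trained on real-valued labels:

```java
// Invented illustration of stochastic gradient descent for a real-valued
// label: fit y ≈ w*x + b by stepping against the squared-error gradient
// one example at a time. Not LBJava code.
public class SgdSketch {
    // Returns {w, b} after `epochs` passes over the data with learning rate `lr`.
    public static double[] fit(double[] x, double[] y, double lr, int epochs) {
        double w = 0.0, b = 0.0;
        for (int e = 0; e < epochs; e++) {
            for (int i = 0; i < x.length; i++) {
                double err = (w * x[i] + b) - y[i]; // signed prediction error
                w -= lr * err * x[i];               // gradient step on the weight
                b -= lr * err;                      // gradient step on the bias
            }
        }
        return new double[] { w, b };
    }

    public static void main(String[] args) {
        double[] x = {1, 2, 3, 4};
        double[] y = {3, 5, 7, 9}; // generated by y = 2x + 1
        double[] wb = fit(x, y, 0.05, 2000);
        System.out.printf("w=%.3f b=%.3f%n", wb[0], wb[1]);
    }
}
```

On this noiseless data the parameters converge to roughly w = 2 and b = 1; LBJava's real learners apply the same per-example update idea to sparse feature vectors.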
