There is a Gitter chat room, but it hasn't seen much activity recently (I notice the link to it has vanished from the README for some reason).
Check the standard output (console output): if Baleen fails to load an OpenNLP model it will throw an exception, so if no exception is thrown the model was loaded successfully.
If you are using the OpenNLP NER models, remember to add the `language.OpenNLP` annotator to the pipeline, preferably before the `stats.OpenNLP` one. For example:
```yaml
- language.OpenNLP
- class: stats.OpenNLP
  model: models/en-ner-person.bin
  type: Person
```
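To put that in context, the relevant part of a pipeline YAML might look roughly like the sketch below. The `consumers` section is an assumption based on the html5 consumer mentioned in the question; check the Baleen documentation for the exact class names and parameters.

```yaml
annotators:
  # Run language processing first so the NER annotator has tokens/sentences to work on
  - language.OpenNLP
  # OpenNLP NER using a pre-trained person model (model path is a placeholder)
  - class: stats.OpenNLP
    model: models/en-ner-person.bin
    type: Person

consumers:
  # The html5 consumer mentioned in the question; exact name per the Baleen docs
  - Html5
```

Without the `language.OpenNLP` step, the NER annotator has nothing to annotate, which would explain an HTML output with no spans.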
Hi all,
So sorry for opening this as an issue, but despite endless Googling, I can't find anywhere to communicate with developers or users of Baleen.
Is there a channel or forum anywhere?
I've set up a basic pipeline (using the html5 consumer), but despite using the OpenNLP annotator, there are no spans being added to the HTML output.
Just seeking advice from others.
Many thanks