
added linear regression #332

Merged: 3 commits merged into typelevel:master on Sep 30, 2018

Conversation

JakubSroka (Contributor)

codecov-io commented Sep 19, 2018

Codecov Report

Merging #332 into master will increase coverage by 1.37%.
The diff coverage is 95.83%.


@@            Coverage Diff             @@
##           master     #332      +/-   ##
==========================================
+ Coverage   94.95%   96.33%   +1.37%     
==========================================
  Files          55       57       +2     
  Lines         991     1009      +18     
  Branches       10       10              
==========================================
+ Hits          941      972      +31     
+ Misses         50       37      -13
Impacted Files                                            Coverage Δ
...a/frameless/ml/internals/LinearInputsChecker.scala     100% <100%> (ø)
...rameless/ml/regression/TypedLinearRegression.scala     93.75% <93.75%> (ø)
...main/scala/frameless/ops/RelationalGroupsOps.scala     79.16% <0%> (-18.46%) ⬇️
...la/frameless/functions/NonAggregateFunctions.scala     100% <0%> (ø) ⬆️
.../frameless/CatalystNumericWithJavaBigDecimal.scala     100% <0%> (ø) ⬆️
...aset/src/main/scala/frameless/ops/GroupByOps.scala     98.36% <0%> (+31.69%) ⬆️

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 30b974a...2775be3.

def setEpsilon(value: Double): TypedLinearRegression[Inputs] = copy(lr.setEpsilon(value))

private def copy(newRf: LinearRegression): TypedLinearRegression[Inputs] =
  new TypedLinearRegression[Inputs](newRf, labelCol, featuresCol, weightCol)
Contributor: Rf?

Contributor Author: changed to lr
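For reference, a minimal sketch of the helper after the rename discussed above; the thread only confirms moving away from the Rf naming, so the exact parameter name used in the final commit (here newLr) is an assumption:

// Sketch only: parameter renamed from newRf; `newLr` is an assumed name.
private def copy(newLr: LinearRegression): TypedLinearRegression[Inputs] =
  new TypedLinearRegression[Inputs](newLr, labelCol, featuresCol, weightCol)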

pDs.select(pDs.col('a), pDs.col('b)).collect.run() == Seq(x2.a -> x2.b)
}
val prop2 = forAll { x2: X2[Vector, Double] =>
val rf = TypedLinearRegression[X2[Vector, Double]]
Contributor: rf? -.-

Contributor Author: changed to lr

)

val ds2 = Seq(
X3(new DenseVector(Array(1.0)): Vector, 2: Float, 1.0),
Contributor: You can just use 2F instead of 2: Float

Contributor Author: ok
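A sketch of that test row with the Float literal suffix applied as suggested (the surrounding Seq is elided):

// 2F is the Float literal form, equivalent to the type-ascribed 2: Float.
X3(new DenseVector(Array(1.0)): Vector, 2F, 1.0),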

model.transformer.getAggregationDepth == 10 &&
model.transformer.getElasticNetParam == 0.5 &&
model.transformer.getEpsilon == 4.0 &&
model.transformer.getFitIntercept == true &&
Contributor: You could skip == true

Contributor Author: ok
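A sketch of the relevant line with the redundant comparison dropped; the same simplification applies to getStandardization in the block below:

// getFitIntercept already returns a Boolean, so comparing it to true adds nothing.
model.transformer.getFitIntercept &&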

model.transformer.getLoss == lossStrategy.sparkValue &&
model.transformer.getMaxIter == 23 &&
model.transformer.getRegParam == 1.2 &&
model.transformer.getStandardization == true &&
Contributor: as above, == true is redundant

Contributor Author: ok

implicit val arbLossStrategy: Arbitrary[LossStrategy] = Arbitrary {
  Gen.oneOf(
    Gen.const(LossStrategy.SquaredError),
    Gen.const(LossStrategy.SquaredError)
Contributor: LossStrategy.Huber

Contributor Author: sure
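A sketch of the generator with the duplicated SquaredError constant replaced by LossStrategy.Huber, per the comment above (assumes the ScalaCheck Arbitrary/Gen imports and the LossStrategy ADT already used in this test file):

implicit val arbLossStrategy: Arbitrary[LossStrategy] = Arbitrary {
  Gen.oneOf(
    Gen.const(LossStrategy.SquaredError),
    Gen.const(LossStrategy.Huber) // was a duplicated SquaredError
  )
}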

@@ -0,0 +1,50 @@
package frameless.ml.regression
Contributor:
package frameless
package ml
package regression

imarios (Contributor), Sep 23, 2018: Same for all files

Contributor Author: sure, as you wish
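For illustration, the suggested chained package clauses at the top of a file such as TypedLinearRegression.scala would look roughly like this; the scoping note in the comment is the usual rationale for this style, not something stated in the thread:

package frameless
package ml
package regression

// With chained clauses, members of the enclosing `frameless` and `frameless.ml`
// packages are visible here without explicit imports.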

OlivierBlanvillain (Contributor):

Thanks a lot for the PR! Merging 🎉

OlivierBlanvillain merged commit 542c0c8 into typelevel:master on Sep 30, 2018