Sunday 26 August 2018

return list of json spring boot rest controller

Let's say we have a list of CarDetail POJOs and we want to return them from a REST endpoint.
import java.util.ArrayList;
import java.util.List;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CarDetailController {

    @GetMapping("/viewAllCarDetailList")
    public List<CarDetail> retrieveAllCarDetails() {
        List<CarDetail> carDetails = new ArrayList<>();

        CarDetail objt = new CarDetail();
        objt.setCarModel("hyundai");
        objt.setSubModel("I10");

        CarDetail objt2 = new CarDetail();
        objt2.setCarModel("hyundai");
        objt2.setSubModel("I20");

        carDetails.add(objt);
        carDetails.add(objt2);
        return carDetails;
    }
}
public class CarDetail {

    private String carModel;
    private String subModel;

    public String getCarModel() { return carModel; }
    public void setCarModel(String carModel) { this.carModel = carModel; }

    public String getSubModel() { return subModel; }
    public void setSubModel(String subModel) { this.subModel = subModel; }

    // equals, hashCode and constructors omitted for brevity
}
Hitting GET /viewAllCarDetailList will produce this JSON output:
[
    {
        "carModel": "hyundai",
        "subModel": "I10"
    },
    {
        "carModel": "hyundai",
        "subModel": "I20"
    }
]

The same response can be seen in Postman.

Reference: https://stackoverflow.com/questions/41719142/how-to-return-a-set-of-objects-with-spring-boot
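If you want to verify the endpoint from code instead of Postman, a minimal sketch using Spring's RestTemplate could look like the following (assuming the application runs locally on port 8080; the CarDetailClient class name is just for illustration):

import java.util.Arrays;

import org.springframework.web.client.RestTemplate;

public class CarDetailClient {

    public static void main(String[] args) {
        RestTemplate restTemplate = new RestTemplate();

        // GET the list and let Jackson map the JSON array to CarDetail[]
        CarDetail[] carDetails = restTemplate.getForObject(
                "http://localhost:8080/viewAllCarDetailList", CarDetail[].class);

        // print each car model and sub model from the response
        Arrays.stream(carDetails)
              .forEach(c -> System.out.println(c.getCarModel() + " " + c.getSubModel()));
    }
}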


Saturday 18 August 2018

scala object hello world eclipse example

A minimal Scala "Hello World" object that you can run in Eclipse:

object HelloWorldScala {
  def main(args: Array[String]): Unit = {
    println("Hello Scala !!")
  }
}

The output will be as below:

Hello Scala !!


spark-shell java was unexpected at this time


You can see this issue when your JAVA_HOME is set like the following on Windows:

JAVA_HOME=C:\Program Files (x86)\Java\jdk1.8.0_162\bin
The issue here is the space in "Program Files (x86)". Wrapping the path in double quotes does not work either on Windows 10:

"C:\Program Files (x86)\Java\jdk1.8.0_162\bin"
You need to copy the Java installation to a location outside "Program Files (x86)"; then it should work:

JAVA_HOME=C:\java\jdk1.8.0_171\bin
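For example, on the Windows command prompt you can update and check the variable like this (a rough sketch; the jdk1.8.0_171 folder name is just an example and may differ on your machine):

rem set the new value (no spaces in the path), then open a new cmd window
setx JAVA_HOME "C:\java\jdk1.8.0_171\bin"

rem verify the value that spark-shell will pick up
echo %JAVA_HOME%
where java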
--------------------------------------------------------------------------------------------------
Finally, I upgraded my Java version to JDK 1.8 for Spark!



spark shell cmd not recognized as an internal or external command



I fixed it after the following changes:

There were multiple Java\bin entries in the system Path variable, so I corrected them to a single Java\bin entry that is in sync with JAVA_HOME (you can check this as shown below).

Added C:\Windows\system32 to the system Path variable.
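A quick way to spot duplicate Java entries and confirm that spark-shell is resolvable from the Path is to run these checks in a command prompt (a rough sketch):

rem list every java.exe found on the Path (duplicates show up here)
where java

rem show the raw Path value to look for stray Java\bin entries
echo %PATH%

rem confirm the Spark bin folder is on the Path
where spark-shell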