Your Restful APIs with zero coding in less than 30 seconds... (see www.restlastic.com)
I spent so much time googling for an easy way to get a free, full-featured REST API without finding what I wanted, so I decided to create Restlastic: the ideas of json-server, the modularity of sails.js, and the scalability and features of elasticsearch and kibana.
See the 5min video presentation
GET /restlastic?price=free

{
  features: {
    create: "Your Restful APIs in less than 30 sec",
    code: "0 line of code",
    enjoy: "Pagination, search, filters, fulltext,...",
    generate: "Instantly swagger, postman, mocha",
    dashboard: "Realtime graph with kibana",
    scalable: "To infinity and beyond",
    customise: "based on node.js, sails.js, elasticsearch"
  },
  tagline: "You Know, for API"
}

Create a products API (/sample/v1/products)
curl -iX POST \
  -d '{ "name":"banana", "status":"available", "price":12 }' \
  'http://api.restlastic.com/sample/v1/products'

Now you've got your products API.
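For instance, you can list the collection you just created (a sketch; the exact fields in the response, such as id, etag, creation_date, depend on your instance):

curl -i 'http://api.restlastic.com/sample/v1/products'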
Also, when doing requests, it's good to know that:
{"name": "Foobar"})id value in the body that differ from id in url will send an errorYou can install Restlastic with docker-compose, npm (if elasticsearch already installed) or use cloud demo beta
The docker-compose setup will install restlastic, elasticsearch and kibana:
$> git clone https://github.com/restlastic/restlastic.git
$> cd restlastic
$> docker-compose up -d

Then use http://localhost (api), http://localhost:9200 (elasticsearch), http://localhost:5601 (kibana).
To install with npm, you must have elasticsearch already installed:
$> git clone https://github.com/restlastic/restlastic.git
$> cd restlastic
$> npm install
$> npm start

Then use http://localhost:1337.
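As a quick smoke test (a sketch; /sample/v1/tests is just a throwaway resource name), you can create and read back a record on the local instance:

curl -iX POST -d '{ "name":"hello" }' 'http://localhost:1337/sample/v1/tests'
curl -i 'http://localhost:1337/sample/v1/tests'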
For the cloud demo (beta): /!\ all data are deleted each day.
Use api.restlastic.com (api), elastic.restlastic.com (elasticsearch) and kibana.restlastic.com (kibana).
Based on the previous POST command, here are all the default routes. You can also add your own routes using sails.js routes.
GET    sample/v1/products
GET    sample/v1/products/search?q=
GET    sample/v1/products/:id
POST   sample/v1/products
POST   sample/v1/products/:id
PATCH  sample/v1/products/:id
PUT    sample/v1/products/:id
DELETE sample/v1/products/:id
GET    sample/v1/products/swagger.yaml
GET    sample/v1/products/postman.json

Add fields= to retrieve only some fields (use . to access deep properties)
GET sample/v1/products/:id?fields=id,name,creation_date
GET sample/v1/products?fields=id,name
GET sample/v1/products?fields=address
GET sample/v1/products?fields=address.locality

Filter on any field by adding field=value to the query string (use . to access deep properties)
GET sample/v1/products?title=elastifull&author=clodio
GET sample/v1/products?id=1,2
GET sample/v1/products?id=1&id=2
GET sample/v1/products?author.name=clodio

Add start_index and/or count (pagination and total_results will be included in the response, with next and previous links)
GET sample/v1/products?start_index=20
GET sample/v1/products?start_index=20&count=3

{
  "data": [...],
  "paging": {
    "total_results": 30,
    "prev": "http://api.restlastic.com/sample/v1/products?start_index=17&count=3",
    "next": "http://api.restlastic.com/sample/v1/products?start_index=23&count=3"
  }
}

Add _gte, _lte, _gt, _lt for getting a range
GET sample/v1/products?price_gte=10&price_lte=20
GET sample/v1/products?creation_date_gte=now-1d/d
GET sample/v1/products?creation_date_gte=now-1d
GET sample/v1/products?creation_date_gte=2014-06-18T23:59:59Z

Add _ne to exclude a value
GET sample/v1/products?price_ne=20

Add _exists=true to find records that have a field, _exists=false to find records without it
GET sample/v1/products?address.locality_exists=true
GET sample/v1/products?address.locality_exists=false

Add _like to filter using like
GET sample/v1/products?title_like=server*

Add _prefix to filter with a prefix
GET sample/v1/products?name_prefix=cl

Add _regex to filter with RegExp
GET sample/v1/products?name_regex=clo.?dio

Add _fuzzy to filter with fuzzy search
GET sample/v1/products?votes_fuzzy=2 --> votes: 1, 2, 3
GET sample/v1/products?name_fuzzy=cladio --> name: clodio
GET sample/v1/products?name_fuzzy=cldio --> name: clodio

Add /search?q= to do a full-text search on all fields
GET sample/v1/products/search?q=*c

Resources can be linked to others with a _; in this case the sub-resource will have a products_id field with the value :id
GET    sample/v1/products/:id/_subRessource
GET    sample/v1/products/:id/_subRessource/search?q=
GET    sample/v1/products/:id/_subRessource/:subRessource_id
POST   sample/v1/products/:id/_subRessource
POST   sample/v1/products/:id/_subRessource/:subRessource_id
PATCH  sample/v1/products/:id/_subRessource/:subRessource_id
PUT    sample/v1/products/:id/_subRessource/:subRessource_id
DELETE sample/v1/products/:id/_subRessource/:subRessource_id
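For example (a sketch; _comments is an illustrative sub-resource name and 1 an existing product id), you could attach a comment to a product; the created record then carries a products_id field with the value 1:

curl -iX POST \
  -d '{ "text":"great bananas" }' \
  'http://api.restlastic.com/sample/v1/products/1/_comments'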
creation_date and modification_date are automatically managed (RFC 3339 dates). You can use now, now-1d, now+1d/d,... when searching/filtering (see the elastic.co date math documentation).
GET sample/v1/products?creation_date_gte=now-1d

All resources have a version; you can use it to cache data or to modify a specific version
GET sample/v1/products/1

Headers: etag: 123456
Body: { "id":"1", "name":"banana", "etag":"123456" }

--> sends an etag header and an etag inside the result
You can use the If-None-Match header to retrieve data only if it has been modified, or to modify data only if it has not been modified.
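A minimal sketch of a conditional GET, assuming the etag value 123456 from the example above; if the resource has not changed, the server can answer without resending the body:

curl -i -H 'If-None-Match: 123456' 'http://api.restlastic.com/sample/v1/products/1'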
You can easily create a new resource or a new field by calling POST (ex: /sample/v1/products). It will create an index in elasticsearch named sample_products. If you want to manage the types of your fields precisely, you must use elasticsearch mapping (https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html). By default, all fields will be considered as strings, integers or dates depending on the first POST. If you want to change a type or delete a field, you must delete the index before changing the mapping.
curl -XGET 'http://localhost:9200/sample_products/_mapping/products'

curl -XPUT 'http://localhost:9200/sample_products/'

curl -XPUT 'http://localhost:9200/sample_products/_mapping/products' \
  -d '{
    "products": {
      "properties": {
        "creation_date":     { "type":"date", "format":"strict_date_optional_time||epoch_millis" },
        "etag":              { "type":"long" },
        "id":                { "type":"string" },
        "modification_date": { "type":"date", "format":"strict_date_optional_time||epoch_millis" },
        "name":              { "type":"string" },
        "price":             { "type":"long" },
        "status":            { "type":"string" }
      }
    }
  }'
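If you later need to change a type (a sketch; note this deletes all data in that index), you can drop the index and recreate the mapping as shown above:

curl -XDELETE 'http://localhost:9200/sample_products'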
You can get a swagger template of your Restful API. Open it directly in the swagger editor: http://editor.swagger.io/#/?import=http://api.restlastic.com/sample/v1/products/swagger.yaml&no-proxy
or download the file
GET http://api.restlastic.com/sample/v1/products/swagger.yaml

You can get a postman template of the Restful API to import into Postman
GET http://api.restlastic.com/sample/v1/products/postman.json

Since data are stored in ELK, you can make graphs with kibana and use specific elastic requests (aggregations,...). To use the data, you must configure an index pattern corresponding to your data (/sample/v1/products will have a sample_products index).
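To see which indices exist before creating the index pattern (a sketch against a local elasticsearch; adjust the host to your setup):

curl 'http://localhost:9200/_cat/indices?v'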
See the kibana documentation for more information.
You can easily create data programmatically.
// import-data.js
var request = require('request'); // assumes the request module: npm install request
var dns = "http://localhost:1337/sample/v1/users/";
var body = {};

// Create 10 users
for (var i = 0; i < 10; i++) {
  body = {
    id: i,
    name: 'user_' + i
  };
  request.post({ url: dns, json: body });
}

Then launch:

$> node import-data.js

Tip: use modules like faker, casual or chance to create random semantic data.
You can add your own routes using sails.js routes in the /config/routes.js file.
MIT - Restlastic