
Blog

Solved: Hive work directory creation issue

Exception:

smartechie:~ sudhir.pradhan$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hive/2.3.1/libexec/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.8.0/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/local/Cellar/hive/2.3.1/libexec/lib/hive-common-2.3.1.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}
at org.apache.hadoop.fs.Path.initialize(Path.java:254)
at org.apache.hadoop.fs.Path.<init>(Path.java:212) at…
Read more
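The URISyntaxException is Hive choking on the unresolved ${system:java.io.tmpdir}/${system:user.name} placeholders in its scratch-directory settings. A minimal sketch of the usual fix, assuming you can edit hive-site.xml (/tmp/hive is a hypothetical directory choice; create it and make it writable before restarting Hive):

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive</value>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/tmp/hive/${hive.session.id}_resources</value>
</property>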

Unzip Multiple Files from Linux Command Line

Problem:

[hadoop@spradhan]$ unzip *.zip
Archive: a.csv.zip
caution: filename not matched: b.csv.zip
caution: filename not matched: c.csv.zip
caution: filename not matched: d.csv.zip
caution: filename not matched: e.csv.zip

The shell expands *.zip before unzip runs, so unzip treats the remaining file names as members to extract from the first archive. Quote the pattern so unzip does the matching itself.

Solution:

[hadoop@spradhan]$ unzip '*.zip'

If you want to run it in the background:

[hadoop@spradhan]$ nohup unzip '*.zip' &

Copy a file or folder from Amazon S3 to EC2

Install the AWS CLI on the EC2 instance (if not already installed):

$ sudo yum install aws-cli

Configure the AWS CLI:

$ aws configure
AWS Access Key ID [None]: <your_access_key>
AWS Secret Access Key [None]: <your_secret_key>
Default region name [None]:
Default output format [None]:

Run the sync command on the EC2 instance:

$ aws s3 sync s3://<path_to_file> <ec2_local_path>
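For example, with hypothetical names (not from the original post), syncing a bucket prefix to a local directory looks like:

$ aws s3 sync s3://my-bucket/reports /home/ec2-user/reports

aws s3 sync is recursive and only transfers files that differ, so it is safe to re-run; for a one-off single file, aws s3 cp works as well.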

Hive: Unable to start metastore issue

Exception:

smartechie:confluent-3.3.0 sudhir.pradhan$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hive/2.1.0/libexec/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.8.0/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/local/Cellar/hive/2.1.0/libexec/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)…
Read more
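"Unable to instantiate ... SessionHiveMetaStoreClient" is very often an uninitialized (or half-initialized) metastore schema. A minimal sketch of the usual remedy for a local, embedded Derby metastore (adjust -dbType for MySQL/Postgres; the rm step is only safe for a disposable local metastore):

$ rm -rf metastore_db            # discard the broken local Derby metastore, if any
$ schematool -initSchema -dbType derby
$ hive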

Hive: Unable to instantiate Metastore

Exception:

2017-12-28T15:05:52,943 INFO [main] org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implementation class: org.apache.hadoop.hive.metastore.ObjectStore
MetaException(message:Version information not found in metastore.)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:83)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6883)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6878)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:7136)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:7063)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at…
Read more
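"Version information not found in metastore" usually means the metastore schema was never initialized, or its VERSION table is missing. The clean fix is to initialize the schema with schematool as in the previous post; a quicker workaround, sketched here assuming you can edit hive-site.xml, is to relax schema verification:

<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>

Note that disabling verification only hides the mismatch; prefer schematool -initSchema where possible.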

Most Used FetchXML Queries in Dynamics 365 CRM

FetchXML is an XML-based query language used in Microsoft Dynamics 365 CRM to fetch data. FetchXML is capable of many things, as explained below. It can only be used to retrieve data; it cannot perform CUD (Create/Update/Delete) operations. It can be used in JavaScript to retrieve data on the client side, and also on the server…
Read more
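For reference, a minimal FetchXML query of the shape the post describes (the entity and attribute names here are illustrative, not taken from the post):

<fetch top="10">
  <entity name="account">
    <attribute name="name" />
    <attribute name="telephone1" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>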

7 Popular Articles of 2017 in Dynamics 365

Wish you all a very special Happy New Year 2018. Here is a list of the top 7 articles that were most popular in 2017. Please go through the articles below. 1. 101 Most Used Dynamics 365 CRM Code Snippets 2. 13 Important Technologies for CRM Developers 3. 11 Top Websites for Dynamics…
Read more

Presto DB: BIGINT or LONG to TIMESTAMP

The timestamp is stored in the Hive column UPDT_DT as a BIGINT, like:

$ presto-cli --catalog hive --schema default
presto:default> select updt_dt from HIVE_SRP_TEST_TBL limit 5;
    updt_dt
---------------
 1497961733000
 1497961733000
 1497961733000
 1497961733000
 1497961733000
(5 rows)

ISSUE: When you simply convert it to a timestamp, the output would be like:

presto:default> select from_unixtime(updt_dt) updt_dt from HIVE_SRP_TEST_TBL limit…
Read more
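The sample values look like epoch milliseconds, while Presto's from_unixtime() expects seconds, so the usual fix (a sketch; verify the unit against your own data) is to divide by 1000:

presto:default> select from_unixtime(updt_dt / 1000) updt_dt from HIVE_SRP_TEST_TBL limit 5;

Use updt_dt / 1000.0 instead if you need to preserve sub-second precision.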

Confluent Kafka HDFS Sink with Hive Integration

Exception:

[2017-11-10 08:32:32,183] ERROR Task hdfs-sink-prqt-stndln-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerSinkTask:455)
java.lang.RuntimeException: java.util.concurrent.ExecutionException: io.confluent.connect.hdfs.errors.HiveMetaStoreException: Invalid partition for default.srp-oracle-jdbc-stdln-raw-KFK_SRP_HDFS_SINK_TEST: partition=0
at io.confluent.connect.hdfs.DataWriter.write(DataWriter.java:226)
at io.confluent.connect.hdfs.HdfsSinkTask.put(HdfsSinkTask.java:103)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:435)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:251)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:180)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:148)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: io.confluent.connect.hdfs.errors.HiveMetaStoreException: Invalid partition for…
Read more

Solved: Running Hive as ec2-user access denied

Exception:

[ec2-user@ip-123-45-67-890 ~]$ hive
Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j2.properties Async: false
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=ec2-user, access=WRITE, inode="/user/ec2-user":hdfs:hadoop:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:320)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1728)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1712)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1695)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)…
Read more
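The root cause is that ec2-user has no home directory in HDFS, so Hive cannot create its working files under /user/ec2-user. A minimal sketch of the usual fix, assuming the hdfs superuser account shown in the error (account and group names may differ on your cluster):

$ sudo -u hdfs hdfs dfs -mkdir -p /user/ec2-user
$ sudo -u hdfs hdfs dfs -chown ec2-user:ec2-user /user/ec2-user
$ hive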

Install GRADLE in Amazon Linux

#!/bin/sh
gradle_version="4.4-rc-5"
# Your custom installation path
install_target_path="/opt/gradle"
wget -N https://services.gradle.org/distributions/gradle-${gradle_version}-all.zip
mkdir -p ${install_target_path}
unzip gradle-${gradle_version}-all.zip
mv gradle-${gradle_version} ${install_target_path}/
ln -sfn gradle-${gradle_version} ${install_target_path}/latest
chown -R ec2-user:ec2-user ${install_target_path}
printf "export GRADLE_HOME=${install_target_path}/latest\nexport PATH=\$PATH:\$GRADLE_HOME/bin" > /etc/profile.d/gradle.sh
. /etc/profile.d/gradle.sh
# Check the installation
gradle -v

Solved: Connection issue to DB from Amazon Linux EC2 instance

[ec2-user@ip-xxx-xx-xx-xx ~]$ sudo vi /etc/docker/daemon.json
[ec2-user@ip-xxx-xx-xx-xx ~]$ sudo cat /etc/docker/daemon.json
{ "bip": "yyy.yyy.y.y/zz" }
[ec2-user@ip-xxx-xx-xx-xx ~]$ sudo service docker stop
Stopping docker:   [ OK ]
[ec2-user@ip-xxx-xx-xx-xx ~]$ sudo…
Read more

Workflow and Dialog Process Stages and Life Cycle

The life cycle of a process describes its state transitions from creation through execution and termination. A process can have the following states: Draft, Ready, Suspended, Locked, and Completed. Events that occur throughout the lifetime of the process cause transitions from one state to another. Workflows: the workflow life cycle and state transition is…
Read more

101 Most Used Dynamics 365 CRM codes

Here is a list of the most used code snippets in Dynamics 365 CRM. Remember to add the required namespaces when inserting the code, and keep the CRM SDK folder ready to reference the assemblies for the namespaces below. The most frequently used namespaces are given below.

[code lang="php"]
using System;
using System.Configuration;
using System.ServiceModel;
//…
Read more

9 Top Tools used by Dynamics 365 CRM Developers

Microsoft Dynamics 365 CRM developers use various software tools to keep improving their day-to-day productivity. These tools are also how developers customize the complex CRM platform. Below is a list of the tools Dynamics CRM developers use in day-to-day project work. Bonus tool…
Read more

13 Essential Technologies for CRM Developers

Dynamics 365 CRM is a widely adopted platform for managing customer data efficiently, used by everything from small- and medium-scale industries to large-scale business organisations. The demand for consultants to manage the technical and functional aspects of Dynamics 365 CRM is also rising quickly. Although the demand for technical, functional, and techno-functional consultants is increasing at…
Read more

Installing Maven using Yum on EC2 instance (Amazon Linux)

Install Maven: The following commands need to be executed sequentially to install Maven.

sudo wget http://repos.fedorapeople.org/repos/dchen/apache-maven/epel-apache-maven.repo -O /etc/yum.repos.d/epel-apache-maven.repo
sudo sed -i s/\$releasever/6/g /etc/yum.repos.d/epel-apache-maven.repo
sudo yum install -y apache-maven
mvn --version

And you are all set to run any "mvn" (Maven) command on the EC2 instance.

Install/upgrade Java 8 using Yum on EC2 instance (Amazon Linux AMI)

Step 1: Install the Java 8 runtime.

sudo yum install java-1.8.0

If you need the Java compiler and other developer tools:

sudo yum install java-1.8.0-openjdk-devel

Step 2: If you have multiple versions installed and one of them is the default, use the alternatives command as follows and enter the selection number as guided in the terminal.

sudo /usr/sbin/alternatives --config…
Read more
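The teaser truncates the alternatives invocation; its full form is typically the following (a hedged sketch, with java -version as the sanity check afterwards):

$ sudo /usr/sbin/alternatives --config java
$ java -version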

11 Top Websites for great Developers

Everybody dreams of being a great Dynamics 365 CRM developer. Here I have compiled a list of websites with excellent resources that can help you master developing Dynamics CRM applications. Microsoft Dynamics 365 Developer Center: this is the official Microsoft website to get the information you need to…
Read more

Step By Step: Installing Kafka on Mac

Open the "Terminal" app from Applications, or press Command + Space, type "Terminal", and press the Enter/Return key. Install Homebrew (copy/paste the following command into the Terminal window and press Enter):

ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" < /dev/null 2> /dev/null

Now install Kafka (copy/paste the following command into the Terminal window and press Enter)…
Read more
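The excerpt cuts off before the install command; with Homebrew in place, the usual sequence is (a sketch, assuming Homebrew-managed Kafka and ZooKeeper):

$ brew install kafka
$ brew services start zookeeper
$ brew services start kafka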

Most Viewed Video Tutorials of Softchief

This post presents a list of the most viewed Softchief video tutorials published on our official YouTube channel. Let's take a look at the tutorials and learn; they are free to view. For the complete video tutorials, please visit the channel HERE. Please subscribe to the YouTube channel.

Step By Step: Getting Kafka installed in Mac OS X Sierra

In this activity we are going to use the excellent package manager Homebrew throughout the installation process. This tool makes it easy to install and manage the latest versions of software and keep them updated. Step 1: Install Homebrew (as an administrator):

$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

The above command will install the following…
Read more

Problem installing ZooKeeper & Kafka in macOS Sierra

Before I could successfully install Kafka in my macOS Sierra environment, the problem I faced was bringing ZooKeeper up and running; the Kafka server can start only if a ZooKeeper server is up. I downloaded the ZooKeeper trunk build from the git repo and extracted it locally. When I started ZooKeeper for the first time from the extracted build,…
Read more
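For context, bringing ZooKeeper up from an extracted build normally looks like this (a sketch, assuming a standard ZooKeeper distribution layout):

$ cp conf/zoo_sample.cfg conf/zoo.cfg
$ bin/zkServer.sh start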

Step By Step : Build and Run Kafka in Eclipse IDE + [ Scala || Java ] + Gradle

As coders, we are most comfortable with editor tools (especially the Eclipse IDE) for rapid development, builds, and continuous integration. When I first tried to develop a Kafka producer and consumer using Scala, I wondered if I could set the same thing up through Eclipse to make life easier; however, after a…
Read more
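For reference, the Kafka source tree's Gradle build can generate Eclipse project metadata, so the setup usually reduces to (a sketch, assuming a checked-out Kafka source directory with the Gradle wrapper):

$ ./gradlew eclipse
# then in Eclipse: File > Import > Existing Projects into Workspace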

REST API Maven Dependency: Jersey + Jetty

Finally! After devoting many hours to R&D, I was able to resolve the long-running dependency conflict between Jersey and Jetty when developing a REST API. I am posting this article because getting a REST API working and running with Tomcat or a similar server is easy, but it might not be with…
Read more
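The post's exact dependency set is behind the Read more link; as a hedged starting point, Jersey's Jetty connector artifact is jersey-container-jetty-http, paired with a matching jetty-server (the versions below are placeholders, not from the post):

<dependency>
  <groupId>org.glassfish.jersey.containers</groupId>
  <artifactId>jersey-container-jetty-http</artifactId>
  <version>2.x</version>
</dependency>
<dependency>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-server</artifactId>
  <version>9.x</version>
</dependency>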

Pass parameters to HTML Web resource

An HTML web resource page can only accept a single custom parameter, called data. To pass more than one value in the data parameter, you need to encode the parameters and then decode them in your page. An example can be found in the SDK folder, which can be downloaded for free; the path is sdk\samplecode\js\webresources\showdataparams.htm. First we have…
Read more
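To make the encoding concrete, here is a short JavaScript sketch (the parameter names are hypothetical, not from the SDK sample):

// pack several values into the single "data" parameter
var data = encodeURIComponent("first=Hello&second=World");
// resulting URL: ...showdataparams.htm?data=first%3DHello%26second%3DWorld

// inside the web resource, decode it back
var decoded = decodeURIComponent(location.search.split("data=")[1]);
// decoded === "first=Hello&second=World"; split on "&" and "=" to recover each pair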