Bhubaneswar, Odisha, India
+91-8328865778
support@softchief.com

Blog

Multi-select Optionset DataType in Dynamics 365 Customer Engagement

Previously we worked with Option Sets, which allow us to select only one item from a list at a time. Dynamics 365 now introduces a new data type called "Multi-select Optionset", which allows us to select multiple items at once. Here we will go through some practical sessions to understand how this Multi-select Optionset works…
Read more

Project Service Automation in Dynamics 365 CE

The Project Service Automation (PSA) module helps organizations efficiently track, manage, and deliver project-based services, from the initial sale all the way to invoicing. Project managers rely on software applications to track and manage the delivery of projects. Typically, the first step after finalizing the project charter is to build the work breakdown structure. The…
Read more

Field Service Module in Dynamics 365 CE

The Field Service module is the foundation of a customer site-visit management system; in this module, businesses deal with the processes below. The Dynamics 365 Field Service business application helps organizations deliver onsite service to customer locations. The application combines workflow automation, scheduling, and mobility to set mobile workers up for success when they're onsite with customers fixing issues…
Read more

Customer Services Module Life Cycle in Microsoft Dynamics 365 CE

The Customer Service module is one of the crucial modules in the Dynamics 365 Customer Engagement business application. Watch the video below for a practical walkthrough, or scroll through the article to read more. The Customer Service module deals with the following entities: Case, Service Activities and Calendar, SLAs, Queues, Knowledge Base, and Entitlements. Every organization must have a sort of…
Read more

Marketing Module Life Cycle in Dynamics 365 Customer Engagement

Every organization spends a lot of money marketing its products and services, so streamlining the marketing department and its processes is a must. Dynamics 365 CE provides a very flexible way of handling the marketing needs of all types of businesses by providing built-in entities to store marketing data and robust insights for…
Read more

How to Read Dynamics 365 CRM Data in JavaScript using Web API with FetchXML

The code snippet below reads account records whose name starts with "M" and shows all the names in an alert.
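For illustration, a minimal version of such a snippet could use the Xrm.WebApi client API with a FetchXML query (the exact code in the full post may differ; the entity and attribute names below are the standard account ones):

function readAccountsStartingWithM() {
    // FetchXML: account records whose name begins with "M"
    var fetchXml =
        "<fetch>" +
        "<entity name='account'>" +
        "<attribute name='name' />" +
        "<filter>" +
        "<condition attribute='name' operator='like' value='M%' />" +
        "</filter>" +
        "</entity>" +
        "</fetch>";

    // Xrm.WebApi expects the query string prefixed with ?fetchXml=
    Xrm.WebApi.retrieveMultipleRecords("account", "?fetchXml=" + encodeURIComponent(fetchXml)).then(
        function (result) {
            // Collect the name of every returned account and show them in one alert
            var names = result.entities.map(function (e) { return e.name; });
            alert(names.join("\n"));
        },
        function (error) {
            console.log(error.message);
        }
    );
}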

Simple, yet Powerful : Python generic property auto assignment

Folks transitioning from Java or any other object-oriented language to Python may struggle, or spend substantial time, assigning method argument variables to class-level attributes every time they write reusable class and function definitions. For example, imagine the function above: what if I have a long list of arguments, and in multiple functions…
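For a quick illustration, one common way to automate that assignment is a small decorator around __init__ (a sketch of the general technique, not necessarily the exact implementation discussed in the post; the Employee class is made up):

import inspect
from functools import wraps

def auto_assign(init):
    """Copy every __init__ argument onto self automatically."""
    sig = inspect.signature(init)

    @wraps(init)
    def wrapper(self, *args, **kwargs):
        bound = sig.bind(self, *args, **kwargs)
        bound.apply_defaults()
        # Skip 'self', assign everything else as an instance attribute
        for name, value in list(bound.arguments.items())[1:]:
            setattr(self, name, value)
        return init(self, *args, **kwargs)

    return wrapper

class Employee:
    @auto_assign
    def __init__(self, name, department, salary=0):
        pass  # self.name, self.department, self.salary are already set

e = Employee("Sudhir", "Engineering")
print(e.name, e.department, e.salary)  # Sudhir Engineering 0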
Read more

Tips & Tricks – Spark Streaming and Amazon S3

As we all know, Amazon S3 is an amazing storage service for persisting hot and cold data in this big-data era. Amazon claims 99.99% availability for it; you can follow Amazon's documentation for more details. While we all agree that S3 storage is the easiest and most resilient…
Read more

Gradle Build – Copying a custom file and replacing tokenized strings

Objective : The goal is to copy a file from the project path to the build path after the project build completes. Example use case : In my case, I have a Dockerfile in the project root directory, and I wanted to copy it to the build directory once the project build completes. Project structure : smartechie-pro | —– src…
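For a quick preview, a build.gradle sketch of such a copy-and-replace task might look like this (the task name, token name, and destination are only illustrative; the full post walks through the real setup):

// Copy the Dockerfile into the build directory and replace @VERSION@ tokens
task copyDockerfile(type: Copy) {
    from 'Dockerfile'
    into "$buildDir"
    filter(org.apache.tools.ant.filters.ReplaceTokens, tokens: [VERSION: project.version.toString()])
}

// Run the copy automatically after the build task finishes
build.finalizedBy copyDockerfile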
Read more

JenkinsFile – the trustAnchors parameter must be non-empty

Error : java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty. This is a problem I faced when running a build or test command on any Java project. The following is the error I got: + make test ./gradlew test Downloading https://services.gradle.org/distributions/gradle-3.3-all.zip Exception in thread "main" javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty…
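One commonly reported remedy, assuming a Debian/Ubuntu box with OpenJDK where the Java trust store at /etc/ssl/certs/java/cacerts is empty or missing, is to regenerate it (a sketch, not a universal fix):

sudo apt-get install --reinstall ca-certificates-java
sudo update-ca-certificates -f
ls -l /etc/ssl/certs/java/cacerts   # should now be a non-empty keystore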
Read more

Working with PostgreSQL on macOS

Install / upgrade PostgreSQL smartechie-macos :~ $ brew install postgresql Get installation details smartechie-macos :~ $ brew info postgres Result postgresql: stable 10.4 (bottled), HEAD Object-relational database system https://www.postgresql.org/ Conflicts with: postgres-xc (because postgresql and postgres-xc install the same binaries.) /usr/local/Cellar/postgresql/9.6.3 (3,260 files, 36.6MB) Poured from bottle on 2017-06-05 at 20:47:39 /usr/local/Cellar/postgresql/10.4 (3,389 files, 39.2MB) *…
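Once installed, a typical next step is to start the service and connect with psql (the database name below is just an example):

smartechie-macos :~ $ brew services start postgresql
smartechie-macos :~ $ psql postgres            # connect to the default postgres database
smartechie-macos :~ $ createdb smartechie_db   # create a database of your own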
Read more

Solved : Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try

Caused by: java.io.IOException: Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[172.27.10.191:50010,DS-51d68378-35de-4c70-b27e-7a98a53919cc,DISK], DatanodeInfoWithStorage[172.27.10.223:50010,DS-f172c682-713d-4a8f-b8af-69198ddc6756,DISK]], original=[DatanodeInfoWithStorage[172.27.10.191:50010,DS-51d68378-35de-4c70-b27e-7a98a53919cc,DISK], DatanodeInfoWithStorage[172.27.10.223:50010,DS-f172c682-713d-4a8f-b8af-69198ddc6756,DISK]]). The current failed datanode replacement policy is DEFAULT, and a client may configure this via ‘dfs.client.block.write.replace-datanode-on-failure.policy’ in its configuration. at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:925) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:988) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1156) at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:454)…
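A commonly suggested client-side workaround for small clusters (two or three datanodes) is to relax the replacement policy in hdfs-site.xml; a sketch of that configuration is below, with the caveat that it trades away some write resilience:

<!-- client-side hdfs-site.xml -->
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
  <value>true</value>
</property>
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <value>NEVER</value>
</property>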
Read more

Unable to connect to Hbase shell

Problem : smartechies.mac $ hbase shell ArgumentError: wrong number of arguments (0 for 1) method_added at file:/usr/local/Cellar/hbase/1.2.6/libexec/lib/jruby-complete-1.6.8.jar!/builtin/javasupport/core_ext/object.rb:10 method_added at file:/usr/local/Cellar/hbase/1.2.6/libexec/lib/jruby-complete-1.6.8.jar!/builtin/javasupport/core_ext/object.rb:129 Pattern at file:/usr/local/Cellar/hbase/1.2.6/libexec/lib/jruby-complete-1.6.8.jar!/builtin/java/java.util.regex.rb:2 (root) at file:/usr/local/Cellar/hbase/1.2.6/libexec/lib/jruby-complete-1.6.8.jar!/builtin/java/java.util.regex.rb:1 require at org/jruby/RubyKernel.java:1062 (root) at file:/usr/local/Cellar/hbase/1.2.6/libexec/lib/jruby-complete-1.6.8.jar!/builtin/java/java.util.regex.rb:42 (root) at /usr/local/Cellar/hbase/1.2.6/libexec/bin/../bin/hirb.rb:38 Solution :  Point the JAVA_HOME variable to correct value in  conf/hbase-env.sh Example : smartechies.mac $ cat ~/.profile | grep JAVA_HOME export…
Read more

Create Superuser in HUE

[ec2-user@ip-123-45-67-890 ~]$ ls -ltr /usr/lib/hue/build/env/bin/hue -rwxr-xr-x 1 root root 523 Sep 22 22:09 /usr/lib/hue/build/env/bin/hue [ec2-user@ip-123-45-67-890 ~]$ sudo /usr/lib/hue/build/env/bin/hue createsuperuser Username (leave blank to use ‘root’): sudhir Email address: mail2sudhir.online@gmail.com Password: Password (again): Superuser created successfully. [ec2-user@ip-123-45-67-890 ~]$

Solved : error "/usr/bin/env: node: No such file or directory"

Error : "/usr/bin/env: node: No such file or directory" Solution : ln -s /usr/bin/nodejs /usr/bin/node Follow-up issue : Error: EACCES: permission denied, open '/home/ubuntu/.config/configstore/bower-github.json' Solved (click here for solution)

Solved: permission denied, open '/home/ubuntu/.config/configstore/bower-github.json' in Ubuntu

Error : Error: EACCES: permission denied, open ‘/home/ubuntu/.config/configstore/bower-github.json’ You don’t have access to this file. at Error (native) at Object.fs.openSync (fs.js:549:18) at Object.fs.readFileSync (fs.js:397:15) at Object.create.all.get (/usr/local/lib/node_modules/bower/lib/node_modules/configstore/index.js:35:26) at Object.Configstore (/usr/local/lib/node_modules/bower/lib/node_modules/configstore/index.js:28:44) at readCachedConfig (/usr/local/lib/node_modules/bower/lib/config.js:19:23) at defaultConfig (/usr/local/lib/node_modules/bower/lib/config.js:11:12) at Object.<anonymous> (/usr/local/lib/node_modules/bower/lib/index.js:16:32) at Module._compile (module.js:410:26) at Object.Module._extensions..js (module.js:417:10)   Solution : sudo chown -R $USER:$GROUP ~/.npm sudo chown…
Read more

Solved : Could not determine java version when running gradle

Error : 21:04:36.791 [INFO] [org.gradle.internal.nativeintegration.services.NativeServices] Initialized native services in: /home/ubuntu/.gradle/native 21:04:36.817 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] 21:04:36.819 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] FAILURE: Build failed with an exception. 21:04:36.819 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] 21:04:36.819 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] * What went wrong: 21:04:36.822 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] Could not determine java version from ‘9.0.4’. 21:04:36.822 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] 21:04:36.822 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] * Exception is: 21:04:36.823 [ERROR] [org.gradle.internal.buildevents.BuildExceptionReporter] java.lang.IllegalArgumentException:…
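The usual cause is a Gradle release that predates Java 9 support; two commonly used fixes are sketched below (the Gradle version and JDK path are only examples):

# gradle/wrapper/gradle-wrapper.properties – point the wrapper at a Java 9-aware Gradle release
distributionUrl=https\://services.gradle.org/distributions/gradle-4.10.3-all.zip

# or, in gradle.properties, keep the existing Gradle but run it on a Java 8 JDK
org.gradle.java.home=/usr/lib/jvm/java-8-openjdk-amd64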
Read more

ERROR when writing file to S3 bucket from EMRFS enabled Spark cluster

ERROR : 18/03/02 01:42:17 INFO RetryInvocationHandler: Exception while invoking ConsistencyCheckerS3FileSystem.mkdirs over null. Retrying after sleeping for 10000ms. com.amazon.ws.emr.hadoop.fs.consistency.exception.ConsistencyException: Directory 'bucket/folder/_temporary' present in the metadata but not s3 at com.amazon.ws.emr.hadoop.fs.consistency.ConsistencyCheckerS3FileSystem.getFileStatus(ConsistencyCheckerS3FileSystem.java:506) Root cause : This consistency problem mostly comes from manual deletion of files and directories from the S3 console, or from retry logic in Spark and Hadoop…
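When the EMRFS metadata and S3 drift apart like this, the EMRFS CLI on the master node can inspect and re-sync them (bucket and folder names below are placeholders):

# compare the EMRFS metadata with what is actually in S3
emrfs diff s3://bucket/folder
# re-sync the metadata with the current S3 contents
emrfs sync s3://bucket/folder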
Read more

Amazon Aurora MySQL Command Line

Connecting to a Database on a DB Instance Running the MySQL Database Engine Once Amazon RDS provisions your DB instance, you can use any standard SQL client application to connect to a database on the DB instance. In this example, you connect to a database on a MySQL DB instance using MySQL monitor commands. One…
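A minimal connection example with the standard mysql client looks like this (the endpoint, port, user, and database name are placeholders for your own instance):

mysql -h mycluster.cluster-abc123xyz.us-east-1.rds.amazonaws.com -P 3306 -u admin -p mydatabase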
Read more

There’s a voice that keeps on calling me

Ulysses, Ulysses – Soaring through all the galaxies. In search of Earth, flying in to the night. Ulysses, Ulysses – Fighting evil and tyranny, with all his power, and with all of his might. Ulysses – no-one else can do the things you do. Ulysses – like a bolt of thunder from the blue. Ulysses…
Read more

80 days around the world

80 days around the world, we’ll find a pot of gold just sitting where the rainbow’s ending. Time – we’ll fight against the time, and we’ll fly on the white wings of the wind. 80 days around the world, no we won’t say a word before the ship is really back. Round, round, all around…
Read more

Create and Insert to Hive table example

Create table : hive> CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3, 2)); OK Time taken: 1.084 seconds List tables : hive> show tables; OK students values__tmp__table__1 Time taken: 0.023 seconds, Fetched: 2 row(s) Describe table : hive> describe students; OK name varchar(64) age …
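For the insert part promised in the title, a minimal example against the same table might look like this (Hive 0.14+ supports INSERT ... VALUES; the sample row is made up):

hive> INSERT INTO TABLE students VALUES ('john doe', 23, 3.50);
hive> SELECT * FROM students;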
Read more

Create and Insert to HBase table example

Log in to the master node : [ec2-user@ip-123-45-67-89 ~]$ sudo hbase shell HBase Shell; enter 'help<RETURN>' for list of supported commands. Type "exit<RETURN>" to leave the HBase Shell Version 1.3.1, rUnknown, Fri Sep 22 21:28:57 UTC 2017 hbase(main):001:0> list tables Create table : Command syntax create '<table_name>', '<column_family>' Example hbase(main):004:0> create 'employee_hbase', 'cf1' Insert data into above…
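A minimal insert into the table created above might look like this (the row key, column qualifier, and value are only illustrative):

hbase(main):005:0> put 'employee_hbase', 'row1', 'cf1:name', 'sudhir'
hbase(main):006:0> scan 'employee_hbase'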
Read more

Where is emrfs-site.xml ?

The emrfs-site.xml file is created if EMRFS is enabled when creating the EMR cluster in AWS. You can manage other related configurations by logging into the master node and looking in the following location: [ec2-user@ip-123-45-67-89 ~]$ ls -ltr /usr/share/aws/emr/emrfs/conf/emrfs-site.xml -rw-r--r-- 1 root root 609 Feb 6 21:59 /usr/share/aws/emr/emrfs/conf/emrfs-site.xml [ec2-user@ip-123-45-67-89 ~]$ cat /usr/share/aws/emr/emrfs/conf/emrfs-site.xml <?xml version="1.0"?>…
Read more

Exception when creating hive table from hdfs parquet file

Problem : FAILED: SemanticException Cannot find class 'parquet.hive.DeprecatedParquetInputFormat' Solution : [hadoop@ip-123-45-67-890 extjars]$ mkdir extjars [hadoop@ip-123-45-67-890 extjars]$ cd extjars/ Now download the required jars: [hadoop@ip-123-45-67-890 extjars]$ for f in parquet-avro parquet-cascading parquet-column parquet-common parquet-encoding parquet-generator parquet-hadoop parquet-hive parquet-pig parquet-scrooge parquet-test-hadoop2 parquet-thrift do curl -O https://oss.sonatype.org/service/local/repositories/releases/content/com/twitter/${f}/1.2.4/${f}-1.2.4.jar done curl -O https://oss.sonatype.org/service/local/repositories/releases/content/com/twitter/parquet-format/1.0.0/parquet-format-1.0.0.jar [hadoop@ip-123-45-67-890 extjars]$ ls -ltr total 5472 -rw-rw-r-- 1 hadoop hadoop 891821 Dec…
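Once downloaded, the jars typically still need to be registered with Hive before the table can be created; a sketch of that step (the path is illustrative):

hive> ADD JAR /home/hadoop/extjars/parquet-hive-1.2.4.jar;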
Read more